AVFoundation tvOS xcode15.1 b2
Alex Soto edited this page Oct 26, 2023 · 1 revision
# AVFoundation.framework
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h 2023-08-19 22:14:55
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsset.h 2023-10-17 02:38:19
@@ -40,7 +40,7 @@
In order to avoid blocking, clients can register their interest in particular keys and to become notified when their values become available. For further details, see AVAsynchronousKeyValueLoading.h. For clients who want to examine a subset of the tracks, metadata, and other parts of the asset, asynchronous methods like -loadTracksWithMediaType:completionHandler: can be used to load this information without blocking. When using these asynchronous methods, it is not necessary to load the associated property beforehand. Swift clients can also use the load(:) method to load properties in a type safe manner.
- On iOS, it is particularly important to avoid blocking. To preserve responsiveness, a synchronous request that blocks for too long (eg, a property request on an asset on a slow HTTP server) may lead to media services being reset.
+ On platforms other than macOS, it is particularly important to avoid blocking. To preserve responsiveness, a synchronous request that blocks for too long (eg, a property request on an asset on a slow HTTP server) may lead to media services being reset.
To play an instance of AVAsset, initialize an instance of AVPlayerItem with it, use the AVPlayerItem to set up its presentation state (such as whether only a limited timeRange of the asset should be played, etc.), and provide the AVPlayerItem to an AVPlayer according to whether the items is to be played by itself or together with a collection of other items. Full details available in AVPlayerItem.h and AVPlayer.h.
@@ -524,7 +524,7 @@
/*!
@property hasProtectedContent
@abstract Indicates whether or not the asset has protected content.
- @discussion Assets containing protected content may not be playable without successful authorization, even if the value of the "playable" property is YES. See the properties in the AVAssetUsability category for details on how such an asset may be used. On OS X, clients can use the interfaces in AVPlayerItemProtectedContentAdditions.h to request authorization to play the asset.
+ @discussion Assets containing protected content may not be playable without successful authorization, even if the value of the "playable" property is YES. See the properties in the AVAssetUsability category for details on how such an asset may be used. On macOS, clients can use the interfaces in AVPlayerItemProtectedContentAdditions.h to request authorization to play the asset.
*/
@property (nonatomic, readonly) BOOL hasProtectedContent
#if __swift__
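
Both AVAsset.h passages above carry the same guidance: load asset properties asynchronously rather than blocking, and treat protected content specially. A minimal Swift sketch of that pattern, assuming a hypothetical `assetURL` supplied by the caller:

```swift
import AVFoundation

func inspectAsset(at assetURL: URL) async throws {
    let asset = AVURLAsset(url: assetURL)
    // load(_:) delivers the requested properties without blocking the calling thread.
    let (isPlayable, hasProtectedContent) = try await asset.load(.isPlayable, .hasProtectedContent)
    if hasProtectedContent {
        // Protected assets may need authorization before playback even when isPlayable is true;
        // on macOS, AVPlayerItemProtectedContentAdditions.h covers the authorization request.
        print("Protected asset; playable: \(isPlayable)")
    }
}
```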
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetResourceLoader.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetResourceLoader.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetResourceLoader.h 2023-08-18 23:48:44
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetResourceLoader.h 2023-10-17 03:02:21
@@ -374,7 +374,7 @@
@abstract The length of the data requested.
@discussion Note that requestsAllDataToEndOfResource will be set to YES when the entire remaining length of the resource is being requested from requestedOffset to the end of the resource. This can occur even when the content length has not yet been reported by you via a prior finished loading request.
When requestsAllDataToEndOfResource has a value of YES, you should disregard the value of requestedLength and incrementally provide as much data starting from the requestedOffset as the resource contains, until you have provided all of the available data successfully and invoked -finishLoading, until you have encountered a failure and invoked -finishLoadingWithError:, or until you have received -resourceLoader:didCancelLoadingRequest: for the AVAssetResourceLoadingRequest from which the AVAssetResourceLoadingDataRequest was obtained.
- When requestsAllDataToEndOfResource is YES and the content length has not yet been provided by you via a prior finished loading request, the value of requestedLength is set to NSIntegerMax. Starting in OS X 10.11 and iOS 9.0, in 32-bit applications requestedLength is also set to NSIntegerMax when all of the remaining resource data is being requested and the known length of the remaining data exceeds NSIntegerMax.
+ When requestsAllDataToEndOfResource is YES and the content length has not yet been provided by you via a prior finished loading request, the value of requestedLength is set to NSIntegerMax. Starting in macOS 10.11 and iOS 9.0, in 32-bit applications requestedLength is also set to NSIntegerMax when all of the remaining resource data is being requested and the known length of the remaining data exceeds NSIntegerMax.
*/
@property (nonatomic, readonly) NSInteger requestedLength;
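
The requestedLength discussion above boils down to one rule for custom resource loaders: when requestsAllDataToEndOfResource is YES, ignore requestedLength and serve everything from requestedOffset onward. A rough sketch, assuming the whole resource is already available as an in-memory `resourceData` (a hypothetical backing store):

```swift
import AVFoundation

final class ChunkedResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    let resourceData: Data   // hypothetical backing store for a custom-scheme resource

    init(resourceData: Data) { self.resourceData = resourceData }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let dataRequest = loadingRequest.dataRequest else { return false }
        let offset = Int(dataRequest.requestedOffset)
        // When requestsAllDataToEndOfResource is YES, requestedLength may be NSIntegerMax;
        // disregard it and provide everything that remains.
        let length = dataRequest.requestsAllDataToEndOfResource
            ? resourceData.count - offset
            : min(dataRequest.requestedLength, resourceData.count - offset)
        dataRequest.respond(with: resourceData.subdata(in: offset ..< offset + length))
        loadingRequest.finishLoading()
        return true
    }
}
```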
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h 2023-08-18 23:48:43
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAssetWriterInput.h 2023-10-17 02:00:31
@@ -37,7 +37,7 @@
AVAssetWriterInput also supports writing per-track metadata collections to the output file.
- As of OS X 10.10 and iOS 8.0 AVAssetWriterInput can also be used to create tracks that are not self-contained. Such tracks reference sample data that is located in another file. This is currently supported only for instances of AVAssetWriterInput attached to an instance of AVAssetWriter that writes files of type AVFileTypeQuickTimeMovie.
+ As of macOS 10.10 and iOS 8.0 AVAssetWriterInput can also be used to create tracks that are not self-contained. Such tracks reference sample data that is located in another file. This is currently supported only for instances of AVAssetWriterInput attached to an instance of AVAssetWriter that writes files of type AVFileTypeQuickTimeMovie.
*/
API_AVAILABLE(macos(10.7), ios(4.1), tvos(9.0)) API_UNAVAILABLE(watchos)
@interface AVAssetWriterInput : NSObject
@@ -306,7 +306,7 @@
If you are working with high bit depth sources the following yuv pixel formats are recommended when encoding to ProRes: kCVPixelFormatType_4444AYpCbCr16, kCVPixelFormatType_422YpCbCr16, and kCVPixelFormatType_422YpCbCr10. When working in the RGB domain kCVPixelFormatType_64ARGB is recommended. Scaling and color matching are not currently supported when using AVAssetWriter with any of these high bit depth pixel formats. Please make sure that your track's output settings dictionary specifies the same width and height as the buffers you will be appending. Do not include AVVideoScalingModeKey or AVVideoColorPropertiesKey.
- As of OS X 10.10 and iOS 8.0, this method can be used to add sample buffers that reference existing data in a file instead of containing media data to be appended to the file. This can be used to generate tracks that are not self-contained. In order to append such a sample reference to the track create a CMSampleBufferRef with a NULL dataBuffer and dataReady set to true and set the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments on the sample buffer. Further documentation on how to create such a "sample reference" sample buffer can be found in the description of the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachment keys in the CMSampleBuffer documentation.
+ As of macOS 10.10 and iOS 8.0, this method can be used to add sample buffers that reference existing data in a file instead of containing media data to be appended to the file. This can be used to generate tracks that are not self-contained. In order to append such a sample reference to the track create a CMSampleBufferRef with a NULL dataBuffer and dataReady set to true and set the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments on the sample buffer. Further documentation on how to create such a "sample reference" sample buffer can be found in the description of the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachment keys in the CMSampleBuffer documentation.
Before calling this method, you must ensure that the receiver is attached to an AVAssetWriter via a prior call to -addInput: and that -startWriting has been called on the asset writer. It is an error to invoke this method before starting a session (via -[AVAssetWriter startSessionAtSourceTime:]) or after ending a session (via -[AVAssetWriter endSessionAtSourceTime:]).
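
To make the sample-reference description above concrete: the two CMSampleBuffer attachments it names can be set as sketched below. The buffer is assumed to have been created elsewhere with a nil dataBuffer and dataReady set to true, and `fileURL`/`byteOffset` are hypothetical caller-supplied values.

```swift
import AVFoundation
import CoreMedia

// Marks an existing sample buffer as a "sample reference" into another file,
// per the kCMSampleBufferAttachmentKey_SampleReferenceURL/ByteOffset discussion above.
func markAsSampleReference(_ sampleBuffer: CMSampleBuffer, fileURL: URL, byteOffset: Int) {
    CMSetAttachment(sampleBuffer,
                    key: kCMSampleBufferAttachmentKey_SampleReferenceURL,
                    value: fileURL as NSURL,
                    attachmentMode: kCMAttachmentMode_ShouldPropagate)
    CMSetAttachment(sampleBuffer,
                    key: kCMSampleBufferAttachmentKey_SampleReferenceByteOffset,
                    value: byteOffset as NSNumber,
                    attachmentMode: kCMAttachmentMode_ShouldPropagate)
}
```

Such buffers can then be appended with -appendSampleBuffer: to an input attached to an AVFileTypeQuickTimeMovie writer, as the header notes.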
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsynchronousKeyValueLoading.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsynchronousKeyValueLoading.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsynchronousKeyValueLoading.h 2023-08-19 22:23:08
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVAsynchronousKeyValueLoading.h 2023-10-17 23:59:53
@@ -32,7 +32,7 @@
1. First, determine whether the value for a given key is available using statusOfValueForKey:error:.
2. If a value has not been loaded yet, you can ask for to load one or more values and be notified when they become available using loadValuesAsynchronouslyForKeys:completionHandler:.
- Even for use cases that may typically support ready access to some keys (such as for assets initialized with URLs for files in the local filesystem), slow I/O may require AVAsset to block before returning their values. Although blocking may be acceptable for OS X API clients in cases where assets are being prepared on background threads or in operation queues, in all cases in which blocking should be avoided you should use loadValuesAsynchronouslyForKeys:completionHandler:. For iOS clients, blocking to obtain the value of a key synchronously is never recommended under any circumstances.
+ Even for use cases that may typically support ready access to some keys (such as for assets initialized with URLs for files in the local filesystem), slow I/O may require AVAsset to block before returning their values. Although blocking may be acceptable for macOS API clients in cases where assets are being prepared on background threads or in operation queues, in all cases in which blocking should be avoided you should use loadValuesAsynchronouslyForKeys:completionHandler:. For clients of platforms other than macOS, blocking to obtain the value of a key synchronously is never recommended under any circumstances.
*/
@protocol AVAsynchronousKeyValueLoading
@required
@@ -52,7 +52,7 @@
Blocking that may occur when calling the getter for any key should therefore be avoided in the general case by loading values for all keys of interest via -loadValuesAsynchronouslyForKeys:completionHandler: and testing the availability of the requested values before fetching them by calling getters.
- The sole exception to this general rule is in usage on Mac OS X on the desktop, where it may be acceptable to block in cases in which the client is preparing objects for use on background threads or in operation queues. On iOS, values should always be loaded asynchronously prior to calling getters for the values, in any usage scenario.
+ The sole exception to this general rule is in usage on macOS on the desktop, where it may be acceptable to block in cases in which the client is preparing objects for use on background threads or in operation queues. On platforms other than macOS, values should always be loaded asynchronously prior to calling getters for the values, in any usage scenario.
*/
- (AVKeyValueStatus)statusOfValueForKey:(NSString *)key error:(NSError * _Nullable * _Nullable)outError AVF_DEPRECATED_FOR_SWIFT_ONLY("Use status(of:) instead", macos(10.7, 13.0), ios(4.0, 16.0), tvos(9.0, 16.0), watchos(1.0, 9.0));
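
The two-step protocol this header documents (ask for the value to be loaded, then check the status before calling the getter) looks roughly like the sketch below; the "duration" key is just an example, and Swift clients are nowadays steered toward status(of:)/load(_:) instead.

```swift
import AVFoundation

func loadDuration(of asset: AVAsset, completion: @escaping (CMTime?) -> Void) {
    // Ask for the value to be loaded and get called back when it becomes available.
    asset.loadValuesAsynchronously(forKeys: ["duration"]) {
        var error: NSError?
        // Only call the getter once statusOfValueForKey:error: reports the value as loaded.
        switch asset.statusOfValue(forKey: "duration", error: &error) {
        case .loaded:
            completion(asset.duration)
        default:
            completion(nil)   // .failed, .cancelled, etc.
        }
    }
}
```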
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h 2023-08-19 21:19:42
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h 2023-09-22 22:50:03
@@ -69,7 +69,7 @@
Instances of AVCaptureDevice can be used to provide media data to an AVCaptureSession by creating an AVCaptureDeviceInput with the device and adding that to the capture session.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureDevice : NSObject
{
@private
@@ -89,7 +89,7 @@
@discussion
This method returns an array of AVCaptureDevice instances for input devices currently connected and available for capture. The returned array contains all devices that are available at the time the method is called. Applications should observe AVCaptureDeviceWasConnectedNotification and AVCaptureDeviceWasDisconnectedNotification to be notified when the list of available devices has changed.
*/
-+ (NSArray<AVCaptureDevice *> *)devices API_DEPRECATED("Use AVCaptureDeviceDiscoverySession instead.", ios(4.0, 10.0), macos(10.7, 10.15), visionos(1.0, 1.0)) API_UNAVAILABLE(tvos);
++ (NSArray<AVCaptureDevice *> *)devices API_DEPRECATED("Use AVCaptureDeviceDiscoverySession instead.", ios(4.0, 10.0), macos(10.7, 10.15)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(tvos);
/*!
@method devicesWithMediaType:
@@ -104,7 +104,7 @@
@discussion
This method returns an array of AVCaptureDevice instances for input devices currently connected and available for capture that provide media of the given type. Media type constants are defined in AVMediaFormat.h. The returned array contains all devices that are available at the time the method is called. Applications should observe AVCaptureDeviceWasConnectedNotification and AVCaptureDeviceWasDisconnectedNotification to be notified when the list of available devices has changed.
*/
-+ (NSArray<AVCaptureDevice *> *)devicesWithMediaType:(AVMediaType)mediaType API_DEPRECATED("Use AVCaptureDeviceDiscoverySession instead.", ios(4.0, 10.0), macos(10.7, 10.15), visionos(1.0, 1.0)) API_UNAVAILABLE(tvos);
++ (NSArray<AVCaptureDevice *> *)devicesWithMediaType:(AVMediaType)mediaType API_DEPRECATED("Use AVCaptureDeviceDiscoverySession instead.", ios(4.0, 10.0), macos(10.7, 10.15)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(tvos);
/*!
@method defaultDeviceWithMediaType:
@@ -119,7 +119,7 @@
@discussion
This method returns the default device of the given media type currently available on the system. For example, for AVMediaTypeVideo, this method will return the built in camera that is primarily used for capture and recording. Media type constants are defined in AVMediaFormat.h.
*/
-+ (nullable AVCaptureDevice *)defaultDeviceWithMediaType:(AVMediaType)mediaType;
++ (nullable AVCaptureDevice *)defaultDeviceWithMediaType:(AVMediaType)mediaType API_UNAVAILABLE(visionos);
/*!
@method deviceWithUniqueID:
@@ -134,7 +134,7 @@
@discussion
Every available capture device has a unique ID that persists on one system across device connections and disconnections, application restarts, and reboots of the system itself. This method can be used to recall or track the status of a specific device whose unique ID has previously been saved.
*/
-+ (nullable AVCaptureDevice *)deviceWithUniqueID:(NSString *)deviceUniqueID;
++ (nullable AVCaptureDevice *)deviceWithUniqueID:(NSString *)deviceUniqueID API_UNAVAILABLE(visionos);
/*!
@property uniqueID
@@ -154,7 +154,7 @@
@discussion
The value of this property is an identifier unique to all devices of the same model. The value is persistent across device connections and disconnections, and across different systems. For example, the model ID of the camera built in to two identical iPhone models will be the same even though they are different physical devices.
*/
-@property(nonatomic, readonly) NSString *modelID;
+@property(nonatomic, readonly) NSString *modelID API_UNAVAILABLE(visionos);
/*!
@property localizedName
@@ -239,7 +239,7 @@
@discussion
An AVCaptureSession instance can be associated with a preset that configures its inputs and outputs to fulfill common use cases. This method can be used to determine if the receiver can be used in a capture session with the given preset. Presets are defined in AVCaptureSession.h.
*/
-- (BOOL)supportsAVCaptureSessionPreset:(AVCaptureSessionPreset)preset;
+- (BOOL)supportsAVCaptureSessionPreset:(AVCaptureSessionPreset)preset API_UNAVAILABLE(visionos);
/*!
@property connected
@@ -249,7 +249,7 @@
@discussion
The value of this property is a BOOL indicating whether the device represented by the receiver is connected and available for use as a capture device. Clients can key value observe the value of this property to be notified when a device is no longer available. When the value of this property becomes NO for a given instance, it will not become YES again. If the same physical device again becomes available to the system, it will be represented using a new instance of AVCaptureDevice.
*/
-@property(nonatomic, readonly, getter=isConnected) BOOL connected;
+@property(nonatomic, readonly, getter=isConnected) BOOL connected API_UNAVAILABLE(visionos);
/*!
@property inUseByAnotherApplication
@@ -289,7 +289,7 @@
@discussion
This property can be used to enumerate the formats natively supported by the receiver. The capture device's activeFormat property may be set to one of the formats in this array. Clients can observe automatic changes to the receiver's formats by key value observing this property.
*/
-@property(nonatomic, readonly) NSArray<AVCaptureDeviceFormat *> *formats API_AVAILABLE(ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+@property(nonatomic, readonly) NSArray<AVCaptureDeviceFormat *> *formats API_AVAILABLE(ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
/*!
@property activeFormat
@@ -320,7 +320,7 @@
Note that when configuring a session to use an active format intended for high resolution still photography and applying one or more of the following operations to an AVCaptureVideoDataOutput, the system may not meet the target framerate: zoom, orientation changes, format conversion.
*/
-@property(nonatomic, retain) AVCaptureDeviceFormat *activeFormat API_AVAILABLE(ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+@property(nonatomic, retain) AVCaptureDeviceFormat *activeFormat API_AVAILABLE(ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
/*!
@property activeVideoMinFrameDuration
@@ -343,7 +343,7 @@
When exposureMode is AVCaptureExposureModeCustom, setting the activeVideoMinFrameDuration affects max frame rate, but not exposureDuration. You may use setExposureModeCustomWithDuration:ISO:completionHandler: to set a shorter exposureDuration than your activeVideoMinFrameDuration, if desired.
*/
-@property(nonatomic) CMTime activeVideoMinFrameDuration API_AVAILABLE(ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+@property(nonatomic) CMTime activeVideoMinFrameDuration API_AVAILABLE(ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
/*!
@property activeVideoMaxFrameDuration
@@ -366,7 +366,7 @@
When exposureMode is AVCaptureExposureModeCustom, frame rate and exposure duration are interrelated. If you call setExposureModeCustomWithDuration:ISO:completionHandler: with an exposureDuration longer than the current activeVideoMaxFrameDuration, the activeVideoMaxFrameDuration will be lengthened to accommodate the longer exposure time. Setting a shorter exposure duration does not automatically change the activeVideoMinFrameDuration or activeVideoMaxFrameDuration. To explicitly increase the frame rate in custom exposure mode, you must set the activeVideoMaxFrameDuration to a shorter value. If your new max frame duration is shorter than the current exposureDuration, the exposureDuration will shorten as well to accommodate the new frame rate.
*/
-@property(nonatomic) CMTime activeVideoMaxFrameDuration API_AVAILABLE(macos(10.9), ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+@property(nonatomic) CMTime activeVideoMaxFrameDuration API_AVAILABLE(macos(10.9), ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
/*!
@property inputSources
@@ -407,10 +407,10 @@
AVCaptureDevicePositionUnspecified = 0,
AVCaptureDevicePositionBack = 1,
AVCaptureDevicePositionFront = 2,
-} API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+} API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureDevice (AVCaptureDevicePosition)
/*!
@@ -641,7 +641,7 @@
If the application wishes to offer users a fully manual camera selection mode in addition to automatic camera selection, it is recommended to call setUserPreferredCamera: each time the user makes a camera selection, but ignore key-value observer updates to systemPreferredCamera while in manual selection mode.
*/
-@property(class, readonly, nullable) AVCaptureDevice *systemPreferredCamera API_AVAILABLE(macos(13.0), ios(17.0), macCatalyst(16.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+@property(class, readonly, nullable) AVCaptureDevice *systemPreferredCamera API_AVAILABLE(macos(13.0), ios(17.0), macCatalyst(16.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
@end
@@ -1866,10 +1866,10 @@
AVAuthorizationStatusRestricted = 1,
AVAuthorizationStatusDenied = 2,
AVAuthorizationStatusAuthorized = 3,
-} API_AVAILABLE(macos(10.14), ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+} API_AVAILABLE(macos(10.14), ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureDevice (AVCaptureDeviceAuthorization)
/*!
@@ -1885,7 +1885,7 @@
@discussion
This method returns the AVAuthorizationStatus of the client for accessing the underlying hardware supporting the media type. Media type constants are defined in AVMediaFormat.h. If any media type other than AVMediaTypeVideo or AVMediaTypeAudio is supplied, an NSInvalidArgumentException will be thrown. If the status is AVAuthorizationStatusNotDetermined, you may use the +requestAccessForMediaType:completionHandler: method to request access by prompting the user.
*/
-+ (AVAuthorizationStatus)authorizationStatusForMediaType:(AVMediaType)mediaType API_AVAILABLE(macos(10.14), ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
++ (AVAuthorizationStatus)authorizationStatusForMediaType:(AVMediaType)mediaType API_AVAILABLE(macos(10.14), ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
/*!
@method requestAccessForMediaType:completionHandler:
@@ -1908,7 +1908,7 @@
The completion handler is called on an arbitrary dispatch queue. It is the client's responsibility to ensure that any UIKit-related updates are called on the main queue or main thread as a result.
*/
-+ (void)requestAccessForMediaType:(AVMediaType)mediaType completionHandler:(void (^)(BOOL granted))handler API_AVAILABLE(macos(10.14), ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
++ (void)requestAccessForMediaType:(AVMediaType)mediaType completionHandler:(void (^)(BOOL granted))handler API_AVAILABLE(macos(10.14), ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
@end
@@ -2619,7 +2619,7 @@
@discussion
An AVCaptureDevice exposes an array of formats, and its current activeFormat may be queried. The payload for the formats property is an array of AVCaptureDeviceFormat objects and the activeFormat property payload is an AVCaptureDeviceFormat. AVCaptureDeviceFormat wraps a CMFormatDescription and expresses a range of valid video frame rates as an NSArray of AVFrameRateRange objects. AVFrameRateRange expresses min and max frame rate as a rate in frames per second and duration (CMTime). An AVFrameRateRange object is immutable. Its values do not change for the life of the object.
*/
-API_AVAILABLE(macos(10.7), ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVFrameRateRange : NSObject
{
@private
@@ -2731,7 +2731,7 @@
@discussion
An AVCaptureDevice exposes an array of formats, and its current activeFormat may be queried. The payload for the formats property is an array of AVCaptureDeviceFormat objects and the activeFormat property payload is an AVCaptureDeviceFormat. AVCaptureDeviceFormat is a thin wrapper around a CMFormatDescription, and can carry associated device format information that doesn't go in a CMFormatDescription, such as min and max frame rate. An AVCaptureDeviceFormat object is immutable. Its values do not change for the life of the object.
*/
-API_AVAILABLE(macos(10.7), ios(7.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(7.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureDeviceFormat : NSObject
{
@private
@@ -2778,7 +2778,7 @@
@discussion
videoFieldOfView is a float value indicating the receiver's field of view in degrees. If field of view is unknown, a value of 0 is returned.
*/
-@property(nonatomic, readonly) float videoFieldOfView API_UNAVAILABLE(macos);
+@property(nonatomic, readonly) float videoFieldOfView API_UNAVAILABLE(macos, visionos);
/*!
@property videoBinned
@@ -2788,7 +2788,7 @@
@discussion
videoBinned is a BOOL indicating whether the format is a binned format. Binning is a pixel-combining process which can result in greater low light sensitivity at the cost of reduced resolution.
*/
-@property(nonatomic, readonly, getter=isVideoBinned) BOOL videoBinned API_UNAVAILABLE(macos);
+@property(nonatomic, readonly, getter=isVideoBinned) BOOL videoBinned API_UNAVAILABLE(macos, visionos);
/*!
@method isVideoStabilizationModeSupported
@@ -2821,7 +2821,7 @@
@discussion
If the device's videoZoomFactor property is assigned a larger value, an NSRangeException will be thrown. A maximum zoom factor of 1 indicates no zoom is available.
*/
-@property(nonatomic, readonly) CGFloat videoMaxZoomFactor API_UNAVAILABLE(macos);
+@property(nonatomic, readonly) CGFloat videoMaxZoomFactor API_UNAVAILABLE(macos, visionos);
/*!
@property videoZoomFactorUpscaleThreshold
@@ -2831,7 +2831,7 @@
@discussion
In some cases the image sensor's dimensions are larger than the dimensions reported by the video AVCaptureDeviceFormat. As long as the sensor crop is larger than the reported dimensions of the AVCaptureDeviceFormat, the image will be downscaled. Setting videoZoomFactor to the value of videoZoomFactorUpscalingThreshold will provide a center crop of the sensor image data without any scaling. If a greater zoom factor is used, then the sensor data will be upscaled to the device format's dimensions.
*/
-@property(nonatomic, readonly) CGFloat videoZoomFactorUpscaleThreshold API_UNAVAILABLE(macos);
+@property(nonatomic, readonly) CGFloat videoZoomFactorUpscaleThreshold API_UNAVAILABLE(macos, visionos);
/*!
@property minExposureDuration
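
Tying the AVCaptureDevice.h changes above together: authorization is checked and requested as the header's authorization section describes, and device lookup goes through AVCaptureDeviceDiscoverySession now that +devices/+devicesWithMediaType: are deprecated. A rough sketch (the chosen device type is arbitrary):

```swift
import AVFoundation

func selectCamera(completion: @escaping (AVCaptureDevice?) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(discoverCamera())
    case .notDetermined:
        // The completion handler runs on an arbitrary queue, as the header warns.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            completion(granted ? discoverCamera() : nil)
        }
    default:
        completion(nil)   // .denied or .restricted
    }
}

func discoverCamera() -> AVCaptureDevice? {
    // Replaces the deprecated +devices / +devicesWithMediaType: lookups.
    let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                     mediaType: .video,
                                                     position: .unspecified)
    return discovery.devices.first
}
```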
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h 2023-08-20 22:56:21
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureInput.h 2023-10-13 15:10:20
@@ -33,7 +33,7 @@
@discussion
Concrete instances of AVCaptureInput representing input sources such as cameras can be added to instances of AVCaptureSession using the -[AVCaptureSession addInput:] method. An AVCaptureInput vends one or more streams of media data. For example, input devices can provide both audio and video data. Each media stream provided by an input is represented by an AVCaptureInputPort object. Within a capture session, connections are made between AVCaptureInput instances and AVCaptureOutput instances via AVCaptureConnection objects that define the mapping between a set of AVCaptureInputPort objects and a single AVCaptureOutput.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureInput : NSObject
{
@private
@@ -50,7 +50,7 @@
@discussion
The value of this property is an array of AVCaptureInputPort objects, each exposing an interface to a single stream of media data provided by an input.
*/
-@property(nonatomic, readonly) NSArray<AVCaptureInputPort *> *ports;
+@property(nonatomic, readonly) NSArray<AVCaptureInputPort *> *ports API_UNAVAILABLE(visionos);
@end
@@ -177,7 +177,7 @@
@discussion
Instances of AVCaptureDeviceInput are input sources for AVCaptureSession that provide media data from devices connected to the system, represented by instances of AVCaptureDevice.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureDeviceInput : AVCaptureInput
{
@private
@@ -199,7 +199,7 @@
@discussion
This method returns an instance of AVCaptureDeviceInput that can be used to capture data from an AVCaptureDevice in an AVCaptureSession. This method attempts to open the device for capture, taking exclusive control of it if necessary. If the device cannot be opened because it is no longer available or because it is in use, for example, this method returns nil, and the optional outError parameter points to an NSError describing the problem.
*/
-+ (nullable instancetype)deviceInputWithDevice:(AVCaptureDevice *)device error:(NSError * _Nullable * _Nullable)outError;
++ (nullable instancetype)deviceInputWithDevice:(AVCaptureDevice *)device error:(NSError * _Nullable * _Nullable)outError API_UNAVAILABLE(visionos);
/*!
@method initWithDevice:error:
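
The device-input creation described in this section can fail when the device is unavailable or in use; a minimal sketch of attaching a device to a session:

```swift
import AVFoundation

func attachCamera(_ device: AVCaptureDevice, to session: AVCaptureSession) throws {
    // init(device:) throws when the device cannot be opened for exclusive use.
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) {
        session.addInput(input)
    }
}
```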
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h 2023-08-19 22:02:10
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureOutputBase.h 2023-10-13 13:59:01
@@ -31,7 +31,7 @@
Concrete AVCaptureOutput instances can be added to an AVCaptureSession using the -[AVCaptureSession addOutput:] and -[AVCaptureSession addOutputWithNoConnections:] methods.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureOutput : NSObject
{
@private
@@ -48,7 +48,7 @@
@discussion
The value of this property is an NSArray of AVCaptureConnection objects, each describing the mapping between the receiver and the AVCaptureInputPorts of one or more AVCaptureInputs.
*/
-@property(nonatomic, readonly) NSArray<AVCaptureConnection *> *connections;
+@property(nonatomic, readonly) NSArray<AVCaptureConnection *> *connections API_UNAVAILABLE(visionos);
/*!
@method connectionWithMediaType:
@@ -61,7 +61,7 @@
@discussion
This convenience method returns the first AVCaptureConnection in the receiver's connections array that has an AVCaptureInputPort of the specified mediaType. If no connection with the specified mediaType is found, nil is returned.
*/
-- (nullable AVCaptureConnection *)connectionWithMediaType:(AVMediaType)mediaType API_AVAILABLE(ios(5.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+- (nullable AVCaptureConnection *)connectionWithMediaType:(AVMediaType)mediaType API_AVAILABLE(ios(5.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
/*!
@method transformedMetadataObjectForMetadataObject:connection:
@@ -136,7 +136,7 @@
AVCaptureOutputDataDroppedReasonLateData = 1,
AVCaptureOutputDataDroppedReasonOutOfBuffers = 2,
AVCaptureOutputDataDroppedReasonDiscontinuity = 3,
-} API_AVAILABLE(macos(10.15), ios(11.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+} API_AVAILABLE(macos(10.15), ios(11.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
NS_ASSUME_NONNULL_END
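
As a small illustration of the connection lookup described in this header, combined with the rotation-angle check that appears later in AVCaptureSession.h (the output is assumed to already be attached to a session):

```swift
import AVFoundation

func configureVideoConnection(of output: AVCaptureOutput) {
    // connection(with:) returns the first connection whose input port carries the given media type.
    guard let connection = output.connection(with: .video) else { return }
    if connection.isVideoRotationAngleSupported(90) {
        connection.videoRotationAngle = 90
    }
}
```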
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h 2023-08-20 22:56:21
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSession.h 2023-09-22 23:54:20
@@ -23,7 +23,7 @@
@discussion
The notification object is the AVCaptureSession instance that encountered a runtime error. The userInfo dictionary contains an NSError for the key AVCaptureSessionErrorKey.
*/
-AVF_EXPORT NSString *const AVCaptureSessionRuntimeErrorNotification API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionRuntimeErrorNotification API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSessionErrorKey
@@ -33,7 +33,7 @@
@discussion
AVCaptureSessionErrorKey may be found in the userInfo dictionary provided with an AVCaptureSessionRuntimeErrorNotification. The NSError associated with the notification gives greater detail on the nature of the error, and in some cases recovery suggestions.
*/
-AVF_EXPORT NSString *const AVCaptureSessionErrorKey API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionErrorKey API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSessionDidStartRunningNotification
@@ -43,7 +43,7 @@
@discussion
Clients may observe the AVCaptureSessionDidStartRunningNotification to know when an instance of AVCaptureSession starts running.
*/
-AVF_EXPORT NSString *const AVCaptureSessionDidStartRunningNotification API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionDidStartRunningNotification API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSessionDidStopRunningNotification
@@ -53,7 +53,7 @@
@discussion
Clients may observe the AVCaptureSessionDidStopRunningNotification to know when an instance of AVCaptureSession stops running. An AVCaptureSession instance may stop running automatically due to external system conditions, such as the device going to sleep, or being locked by a user.
*/
-AVF_EXPORT NSString *const AVCaptureSessionDidStopRunningNotification API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionDidStopRunningNotification API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSessionWasInterruptedNotification
@@ -65,7 +65,7 @@
Beginning in iOS 9.0, the AVCaptureSessionWasInterruptedNotification userInfo dictionary contains an AVCaptureSessionInterruptionReasonKey indicating the reason for the interruption.
*/
-AVF_EXPORT NSString *const AVCaptureSessionWasInterruptedNotification API_AVAILABLE(macos(10.14), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionWasInterruptedNotification API_AVAILABLE(macos(10.14), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
/*!
@@ -89,8 +89,8 @@
AVCaptureSessionInterruptionReasonAudioDeviceInUseByAnotherClient = 2,
AVCaptureSessionInterruptionReasonVideoDeviceInUseByAnotherClient = 3,
AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps = 4,
- AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableDueToSystemPressure API_AVAILABLE(ios(11.1), macCatalyst(14.0)) API_UNAVAILABLE(visionos) = 5,
-} API_AVAILABLE(ios(9.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+ AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableDueToSystemPressure API_AVAILABLE(ios(11.1), macCatalyst(14.0), visionos(1.0)) = 5,
+} API_AVAILABLE(ios(9.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@@ -101,7 +101,7 @@
@discussion
AVCaptureSessionInterruptionReasonKey may be found in the userInfo dictionary provided with an AVCaptureSessionWasInterruptedNotification. The NSNumber associated with the notification tells you why the interruption occurred.
*/
-AVF_EXPORT NSString *const AVCaptureSessionInterruptionReasonKey API_AVAILABLE(ios(9.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionInterruptionReasonKey API_AVAILABLE(ios(9.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSessionInterruptionSystemPressureStateKey
@@ -111,7 +111,7 @@
@discussion
This key is only present when the AVCaptureSessionInterruptionReasonKey equals AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableDueToSystemPressure.
*/
-AVF_EXPORT NSString *const AVCaptureSessionInterruptionSystemPressureStateKey API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionInterruptionSystemPressureStateKey API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSessionInterruptionEndedNotification
@@ -121,7 +121,7 @@
@discussion
Clients may observe the AVCaptureSessionInterruptionEndedNotification to know when an instance of AVCaptureSession ceases to be interrupted, for example, when a phone call ends, and hardware resources needed to run the session are again available. When appropriate, the AVCaptureSession instance that was previously stopped in response to an interruption will automatically restart once the interruption ends.
*/
-AVF_EXPORT NSString *const AVCaptureSessionInterruptionEndedNotification API_AVAILABLE(macos(10.14), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT NSString *const AVCaptureSessionInterruptionEndedNotification API_AVAILABLE(macos(10.14), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
/*!
@enum AVCaptureVideoOrientation
@@ -159,7 +159,7 @@
@discussion
To perform a real-time capture, a client may instantiate AVCaptureSession and add appropriate AVCaptureInputs, such as AVCaptureDeviceInput, and outputs, such as AVCaptureMovieFileOutput. [AVCaptureSession startRunning] starts the flow of data from the inputs to the outputs, and [AVCaptureSession stopRunning] stops the flow. A client may set the sessionPreset property to customize the quality level or bitrate of the output.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureSession : NSObject
{
@private
@@ -179,7 +179,7 @@
@discussion
An AVCaptureSession instance can be associated with a preset that configures its inputs and outputs to fulfill common use cases. This method can be used to determine if the receiver supports the desired preset given its current input and output configuration. The receiver's sessionPreset property may only be set to a certain preset if this method returns YES for that preset.
*/
-- (BOOL)canSetSessionPreset:(AVCaptureSessionPreset)preset;
+- (BOOL)canSetSessionPreset:(AVCaptureSessionPreset)preset API_UNAVAILABLE(visionos);
/*!
@property sessionPreset
@@ -189,7 +189,7 @@
@discussion
The value of this property is an AVCaptureSessionPreset indicating the current session preset in use by the receiver. The sessionPreset property may be set while the receiver is running.
*/
-@property(nonatomic, copy) AVCaptureSessionPreset sessionPreset;
+@property(nonatomic, copy) AVCaptureSessionPreset sessionPreset API_UNAVAILABLE(visionos);
/*!
@property inputs
@@ -333,7 +333,7 @@
@discussion
The value of this property is an NSArray of AVCaptureConnections currently added to the receiver. Connections are formed implicitly by the receiver when a client calls -addInput: or -addOutput:. Connections are formed explicitly when a client calls -addConnection:.
*/
-@property(nonatomic, readonly) NSArray<AVCaptureConnection *> *connections API_AVAILABLE(macos(10.15), ios(13.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos);
+@property(nonatomic, readonly) NSArray<AVCaptureConnection *> *connections API_AVAILABLE(macos(10.15), ios(13.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos);
/*!
@method canAddConnection:
@@ -412,7 +412,7 @@
@discussion
The value of this property is a BOOL indicating whether the receiver is currently being interrupted, such as by a phone call or alarm. Clients can key value observe the value of this property to be notified when the session ceases to be interrupted and again has access to needed hardware resources.
*/
-@property(nonatomic, readonly, getter=isInterrupted) BOOL interrupted API_AVAILABLE(ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos);
+@property(nonatomic, readonly, getter=isInterrupted) BOOL interrupted API_AVAILABLE(ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos);
/*!
@property multitaskingCameraAccessSupported
@@ -515,7 +515,7 @@
This property is key-value observable.
*/
-@property(nonatomic, readonly, nullable) __attribute__((NSObject)) CMClockRef synchronizationClock API_AVAILABLE(macos(12.3), ios(15.4), macCatalyst(15.4), tvos(17.0)) API_UNAVAILABLE(visionos);
+@property(nonatomic, readonly, nullable) __attribute__((NSObject)) CMClockRef synchronizationClock API_AVAILABLE(macos(12.3), ios(15.4), macCatalyst(15.4), tvos(17.0), visionos(1.0));
/*!
@property masterClock
@@ -645,7 +645,7 @@
Connections involving video expose video specific properties, such as videoMirrored and videoRotationAngle.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureConnection : NSObject
{
@private
@@ -731,7 +731,7 @@
@discussion
An AVCaptureConnection may involve one or more AVCaptureInputPorts producing data to the connection's AVCaptureOutput. This property is read-only. An AVCaptureConnection's inputPorts remain static for the life of the object.
*/
-@property(nonatomic, readonly) NSArray<AVCaptureInputPort *> *inputPorts;
+@property(nonatomic, readonly) NSArray<AVCaptureInputPort *> *inputPorts API_UNAVAILABLE(visionos);
/*!
@property output
@@ -741,7 +741,7 @@
@discussion
An AVCaptureConnection may involve one or more AVCaptureInputPorts producing data to the connection's AVCaptureOutput. This property is read-only. An AVCaptureConnection's output remains static for the life of the object. Note that a connection can either be to an output or a video preview layer, but never to both.
*/
-@property(nonatomic, readonly, nullable) AVCaptureOutput *output;
+@property(nonatomic, readonly, nullable) AVCaptureOutput *output API_UNAVAILABLE(visionos);
/*!
@property videoPreviewLayer
@@ -761,7 +761,7 @@
@discussion
The value of this property is a BOOL that determines whether the receiver's output should consume data from its connected inputPorts when a session is running. Clients can set this property to stop the flow of data to a given output during capture. The default value is YES.
*/
-@property(nonatomic, getter=isEnabled) BOOL enabled;
+@property(nonatomic, getter=isEnabled) BOOL enabled API_UNAVAILABLE(visionos);
/*!
@property active
@@ -773,7 +773,7 @@
Prior to iOS 11, the audio connection feeding an AVCaptureAudioDataOutput is made inactive when using AVCaptureSessionPresetPhoto or the equivalent photo format using -[AVCaptureDevice activeFormat]. On iOS 11 and later, the audio connection feeding AVCaptureAudioDataOutput is active for all presets and device formats.
*/
-@property(nonatomic, readonly, getter=isActive) BOOL active;
+@property(nonatomic, readonly, getter=isActive) BOOL active API_UNAVAILABLE(visionos);
/*!
@property audioChannels
@@ -783,7 +783,7 @@
@discussion
This property is only applicable to AVCaptureConnection instances involving audio. In such connections, the audioChannels array contains one AVCaptureAudioChannel object for each channel of audio data flowing through this connection.
*/
-@property(nonatomic, readonly) NSArray<AVCaptureAudioChannel *> *audioChannels;
+@property(nonatomic, readonly) NSArray<AVCaptureAudioChannel *> *audioChannels API_UNAVAILABLE(visionos);
/*!
@property supportsVideoMirroring
@@ -829,7 +829,7 @@
@discussion
The connection's videoRotationAngle property can only be set to a certain angle if this method returns YES for that angle. Only rotation angles of 0, 90, 180 and 270 are supported.
*/
-- (BOOL)isVideoRotationAngleSupported:(CGFloat)videoRotationAngle;
+- (BOOL)isVideoRotationAngleSupported:(CGFloat)videoRotationAngle API_UNAVAILABLE(visionos);
/*!
@property videoRotationAngle
@@ -891,7 +891,7 @@
This property is deprecated on iOS, where min and max frame rate adjustments are applied exclusively at the AVCaptureDevice using the activeVideoMinFrameDuration and activeVideoMaxFrameDuration properties. On macOS, frame rate adjustments are supported both at the AVCaptureDevice and at AVCaptureConnection, enabling connections to output different frame rates.
*/
-@property(nonatomic, readonly, getter=isVideoMinFrameDurationSupported) BOOL supportsVideoMinFrameDuration API_DEPRECATED("Use AVCaptureDevice's activeFormat.videoSupportedFrameRateRanges instead.", ios(5.0, 7.0), macCatalyst(14.0, 14.0), visionos(1.0, 1.0)) API_UNAVAILABLE(tvos);
+@property(nonatomic, readonly, getter=isVideoMinFrameDurationSupported) BOOL supportsVideoMinFrameDuration API_DEPRECATED("Use AVCaptureDevice's activeFormat.videoSupportedFrameRateRanges instead.", ios(5.0, 7.0), macCatalyst(14.0, 14.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(tvos);
/*!
@property videoMinFrameDuration
@@ -903,7 +903,7 @@
This property is deprecated on iOS, where min and max frame rate adjustments are applied exclusively at the AVCaptureDevice using the activeVideoMinFrameDuration and activeVideoMaxFrameDuration properties. On macOS, frame rate adjustments are supported both at the AVCaptureDevice and at AVCaptureConnection, enabling connections to output different frame rates.
*/
-@property(nonatomic) CMTime videoMinFrameDuration API_DEPRECATED("Use AVCaptureDevice's activeVideoMinFrameDuration instead.", ios(5.0, 7.0), macCatalyst(14.0, 14.0), visionos(1.0, 1.0)) API_UNAVAILABLE(tvos);
+@property(nonatomic) CMTime videoMinFrameDuration API_DEPRECATED("Use AVCaptureDevice's activeVideoMinFrameDuration instead.", ios(5.0, 7.0), macCatalyst(14.0, 14.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(tvos);
/*!
@property supportsVideoMaxFrameDuration
@@ -915,7 +915,7 @@
This property is deprecated on iOS, where min and max frame rate adjustments are applied exclusively at the AVCaptureDevice using the activeVideoMinFrameDuration and activeVideoMaxFrameDuration properties. On macOS, frame rate adjustments are supported both at the AVCaptureDevice and at AVCaptureConnection, enabling connections to output different frame rates.
*/
-@property(nonatomic, readonly, getter=isVideoMaxFrameDurationSupported) BOOL supportsVideoMaxFrameDuration API_AVAILABLE(macos(10.9)) API_DEPRECATED("Use AVCaptureDevice's activeFormat.videoSupportedFrameRateRanges instead.", ios(5.0, 7.0), macCatalyst(14.0, 14.0), visionos(1.0, 1.0)) API_UNAVAILABLE(tvos);
+@property(nonatomic, readonly, getter=isVideoMaxFrameDurationSupported) BOOL supportsVideoMaxFrameDuration API_AVAILABLE(macos(10.9)) API_DEPRECATED("Use AVCaptureDevice's activeFormat.videoSupportedFrameRateRanges instead.", ios(5.0, 7.0), macCatalyst(14.0, 14.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(tvos);
/*!
@property videoMaxFrameDuration
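
Putting the AVCaptureSession pieces above together, a minimal setup with runtime-error observation might look like the following sketch (the preset choice and error handling are placeholders):

```swift
import AVFoundation

func makeSession(with device: AVCaptureDevice) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    if session.canSetSessionPreset(.high) {
        session.sessionPreset = .high
    }

    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }

    // The userInfo dictionary carries an NSError under AVCaptureSessionErrorKey.
    // Real code should retain the returned observer token.
    _ = NotificationCenter.default.addObserver(forName: .AVCaptureSessionRuntimeError,
                                               object: session, queue: .main) { note in
        let error = note.userInfo?[AVCaptureSessionErrorKey] as? NSError
        print("Capture runtime error: \(String(describing: error))")
    }

    session.startRunning()
    return session
}
```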
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSystemPressure.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSystemPressure.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSystemPressure.h 2023-08-20 23:07:01
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureSystemPressure.h 2023-10-14 02:43:29
@@ -18,37 +18,37 @@
@discussion
The AVCaptureSystemPressureLevel string constants describe varying levels of system pressure that affect capture hardware availability and/or quality.
*/
-typedef NSString *AVCaptureSystemPressureLevel NS_TYPED_ENUM API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+typedef NSString *AVCaptureSystemPressureLevel NS_TYPED_ENUM API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSystemPressureLevelNominal
System pressure level is normal (not pressured).
*/
-AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelNominal API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelNominal API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSystemPressureLevelFair
System pressure is slightly elevated.
*/
-AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelFair API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelFair API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSystemPressureLevelSerious
System pressure is highly elevated. Capture performance may be impacted. Frame rate throttling is advised.
*/
-AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelSerious API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelSerious API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSystemPressureLevelCritical
System pressure is critically elevated. Capture quality and performance are significantly impacted. Frame rate throttling is highly advised.
*/
-AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelCritical API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelCritical API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@constant AVCaptureSystemPressureLevelShutdown
System pressure is beyond critical. Capture must immediately stop.
*/
-AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelShutdown API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+AVF_EXPORT AVCaptureSystemPressureLevel const AVCaptureSystemPressureLevelShutdown API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
/*!
@enum AVCaptureSystemPressureFactors
@@ -72,7 +72,7 @@
AVCaptureSystemPressureFactorPeakPower = (1UL << 1),
AVCaptureSystemPressureFactorDepthModuleTemperature = (1UL << 2),
AVCaptureSystemPressureFactorCameraTemperature API_AVAILABLE(ios(17.0), macCatalyst(17.0), tvos(17.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos) = (1UL << 3),
-} API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos);
+} API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos);
@class AVCaptureSystemPressureStateInternal;
@@ -84,7 +84,7 @@
@discussion
Beginning in iOS 11.1, AVCaptureDevice can report its current system pressure state. System pressure refers to a state in which capture quality is degraded or capture hardware availability is limited due to factors such as overall system temperature, insufficient battery charge for current peak power requirements, or camera module temperature.
*/
-API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(macos, visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(ios(11.1), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(macos) API_UNAVAILABLE(watchos)
@interface AVCaptureSystemPressureState : NSObject
{
@private
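With this change, AVCaptureSystemPressureState and its level constants are marked available on visionOS in addition to iOS, Mac Catalyst, and tvOS. Below is a minimal Objective-C sketch of how a capture client might react to these levels; the `PressureMonitor` class and the 20 fps cap are illustrative and not part of the SDK.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: observe a capture device's systemPressureState via KVO and throttle
// the frame rate once pressure reaches a serious or critical level.
@interface PressureMonitor : NSObject
@property (nonatomic, strong) AVCaptureDevice *device;
@end

@implementation PressureMonitor

- (instancetype)initWithDevice:(AVCaptureDevice *)device
{
    if ((self = [super init])) {
        _device = device;
        [_device addObserver:self
                  forKeyPath:@"systemPressureState"
                     options:NSKeyValueObservingOptionNew
                     context:NULL];
    }
    return self;
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    AVCaptureSystemPressureLevel level = self.device.systemPressureState.level;
    if ([level isEqualToString:AVCaptureSystemPressureLevelSerious] ||
        [level isEqualToString:AVCaptureSystemPressureLevelCritical]) {
        NSError *error = nil;
        if ([self.device lockForConfiguration:&error]) {
            // Frame rate throttling is advised at these levels; 20 fps is an arbitrary cap.
            self.device.activeVideoMaxFrameDuration = CMTimeMake(1, 20);
            [self.device unlockForConfiguration];
        }
    }
}

- (void)dealloc
{
    [_device removeObserver:self forKeyPath:@"systemPressureState"];
}

@end
```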
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureVideoDataOutput.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureVideoDataOutput.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureVideoDataOutput.h 2023-08-19 21:19:42
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureVideoDataOutput.h 2023-09-22 23:54:20
@@ -26,7 +26,7 @@
@discussion
Instances of AVCaptureVideoDataOutput produce video frames suitable for processing using other media APIs. Applications can access the frames with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@interface AVCaptureVideoDataOutput : AVCaptureOutput
{
@private
@@ -190,7 +190,7 @@
@discussion
The value of this property is an NSArray of NSNumbers that can be used as values for the kCVPixelBufferPixelFormatTypeKey in the receiver's videoSettings property. The formats are listed in an unspecified order. This list may change if the activeFormat of the AVCaptureDevice connected to the receiver changes.
*/
-@property(nonatomic, readonly) NSArray<NSNumber *> *availableVideoCVPixelFormatTypes API_AVAILABLE(ios(5.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+@property(nonatomic, readonly) NSArray<NSNumber *> *availableVideoCVPixelFormatTypes API_AVAILABLE(ios(5.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
/*!
@property availableVideoCodecTypes
@@ -250,7 +250,7 @@
@abstract
Defines an interface for delegates of AVCaptureVideoDataOutput to receive captured video sample buffers and be notified of late sample buffers that were dropped.
*/
-API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos) API_UNAVAILABLE(watchos)
+API_AVAILABLE(macos(10.7), ios(4.0), macCatalyst(14.0), tvos(17.0), visionos(1.0)) API_UNAVAILABLE(watchos)
@protocol AVCaptureVideoDataOutputSampleBufferDelegate <NSObject>
@optional
@@ -291,7 +291,7 @@
@discussion
Delegates receive this message whenever a video frame is dropped. This method is called once for each dropped frame. The CMSampleBuffer object passed to this delegate method will contain metadata about the dropped video frame, such as its duration and presentation time stamp, but will contain no actual video data. On iOS, included in the sample buffer attachments is the kCMSampleBufferAttachmentKey_DroppedFrameReason, which indicates why the frame was dropped. This method will be called on the dispatch queue specified by the output's sampleBufferCallbackQueue property. Because this method will be called on the same dispatch queue that is responsible for outputting video frames, it must be efficient to prevent further capture performance problems, such as additional dropped video frames.
*/
-- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection API_AVAILABLE(ios(6.0), macCatalyst(14.0), tvos(17.0)) API_UNAVAILABLE(visionos);
+- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection API_AVAILABLE(ios(6.0), macCatalyst(14.0), tvos(17.0), visionos(1.0));
@end
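AVCaptureVideoDataOutput and its sample buffer delegate protocol are likewise now marked available on visionOS. A minimal sketch of wiring the output into an already-configured session follows; the `FrameReceiver` class, the queue label, and the 32BGRA pixel format are placeholder choices.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: a delegate that receives output and dropped frames, plus a helper that
// attaches an AVCaptureVideoDataOutput to an existing session.
@interface FrameReceiver : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@end

@implementation FrameReceiver

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    (void)pixelBuffer; // Process frames here; keep this fast to avoid dropped frames.
}

- (void)captureOutput:(AVCaptureOutput *)output
  didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // On iOS, kCMSampleBufferAttachmentKey_DroppedFrameReason explains the drop.
}

@end

static AVCaptureVideoDataOutput *AddVideoDataOutput(AVCaptureSession *session,
                                                    FrameReceiver *receiver)
{
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    dispatch_queue_t queue = dispatch_queue_create("video.data.output", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:receiver queue:queue];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }
    return output;
}
```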
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVComposition.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVComposition.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVComposition.h 2023-08-16 00:21:35
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVComposition.h 2023-09-22 23:42:42
@@ -204,7 +204,7 @@
@param timeRange
Specifies the timeRange of the asset to be inserted.
@param asset
- Specifies the asset that contains the tracks that are to be inserted. Only instances of AVURLAsset and AVComposition are supported (AVComposition starting in MacOS X 10.10 and iOS 8.0).
+ Specifies the asset that contains the tracks that are to be inserted. Only instances of AVURLAsset and AVComposition are supported (AVComposition starting in macOS 10.10 and iOS 8.0).
@param startTime
Specifies the time at which the inserted tracks are to be presented by the composition.
@param outError
@@ -225,7 +225,7 @@
@param timeRange
Specifies the timeRange of the asset to be inserted.
@param asset
- Specifies the asset that contains the tracks that are to be inserted. Only instances of AVURLAsset and AVComposition are supported (AVComposition starting in MacOS X 10.10 and iOS 8.0).
+ Specifies the asset that contains the tracks that are to be inserted. Only instances of AVURLAsset and AVComposition are supported (AVComposition starting in macOS 10.10 and iOS 8.0).
@param startTime
Specifies the time at which the inserted tracks are to be presented by the composition.
@param completionHandler
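The hunks above only rename OS X to macOS in the documentation for the composition-level insert methods. For context, a minimal sketch of -insertTimeRange:ofAsset:atTime:error: is shown below; the file path and the ten-second range are placeholders, and production code should load asset properties asynchronously before inserting.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: append the first ten seconds of a URL asset to a mutable composition.
static AVMutableComposition *MakeComposition(void)
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:@"/path/to/movie.mov"]
                                            options:nil];
    AVMutableComposition *composition = [AVMutableComposition composition];
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, CMTimeMake(10, 1));

    NSError *error = nil;
    // Only AVURLAsset and AVComposition sources are supported here.
    if (![composition insertTimeRange:range ofAsset:asset atTime:kCMTimeZero error:&error]) {
        NSLog(@"Insert failed: %@", error);
    }
    return composition;
}
```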
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCompositionTrack.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCompositionTrack.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCompositionTrack.h 2023-08-19 22:23:08
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCompositionTrack.h 2023-10-17 23:59:54
@@ -142,7 +142,7 @@
@param timeRange
Specifies the timeRange of the track to be inserted.
@param track
- Specifies the source track to be inserted. Only AVAssetTracks of AVURLAssets and AVCompositions are supported (AVCompositions starting in MacOS X 10.10 and iOS 8.0).
+ Specifies the source track to be inserted. Only AVAssetTracks of AVURLAssets and AVCompositions are supported (AVCompositions starting in macOS 10.10 and iOS 8.0).
@param startTime
Specifies the time at which the inserted track is to be presented by the composition track. You may pass kCMTimeInvalid for startTime to indicate that the timeRange should be appended to the end of the track.
@param error
@@ -161,7 +161,7 @@
@param timeRanges
Specifies the timeRanges to be inserted. An NSArray of NSValues containing CMTimeRange. (See +[NSValue valueWithCMTimeRange:] in AVTime.h.)
@param tracks
- Specifies the source tracks to be inserted. Only AVAssetTracks of AVURLAssets and AVCompositions are supported (AVCompositions starting in MacOS X 10.10 and iOS 8.0).
+ Specifies the source tracks to be inserted. Only AVAssetTracks of AVURLAssets and AVCompositions are supported (AVCompositions starting in macOS 10.10 and iOS 8.0).
@param startTime
Specifies the time at which the inserted tracks are to be presented by the composition track. You may pass kCMTimeInvalid for startTime to indicate that the timeRanges should be appended to the end of the track.
@param error
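Likewise for the track-level insert documented above, here is a minimal sketch using -insertTimeRange:ofTrack:atTime:error:; the media type, the five-second range, and the synchronous track lookup are illustrative only.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: copy five seconds of a source asset's first video track into a new
// composition track, appending at the end by passing kCMTimeInvalid.
static void AppendVideoTrack(AVMutableComposition *composition, AVURLAsset *asset)
{
    AVMutableCompositionTrack *dstTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *srcTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    if (dstTrack == nil || srcTrack == nil) {
        return;
    }

    NSError *error = nil;
    CMTimeRange range = CMTimeRangeMake(kCMTimeZero, CMTimeMake(5, 1));
    if (![dstTrack insertTimeRange:range ofTrack:srcTrack atTime:kCMTimeInvalid error:&error]) {
        NSLog(@"Track insert failed: %@", error);
    }
}
```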
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVExternalStorageDevice.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVExternalStorageDevice.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVExternalStorageDevice.h 2023-08-09 02:57:54
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVExternalStorageDevice.h 2023-10-16 05:33:15
@@ -173,9 +173,9 @@
Returns the singleton instance of the external storage device discovery session.
@discussion
- There is only one external storage device discovery session for each host device which can be accessed using this method.
+    There is only one external storage device discovery session for each host device, which can be accessed using this method. This property will return nil if the device doesn't support external storage devices.
*/
-@property(class, readonly) AVExternalStorageDeviceDiscoverySession *sharedSession;
+@property(class, readonly, nullable) AVExternalStorageDeviceDiscoverySession *sharedSession;
/*!
@property externalStorageDevices
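With sharedSession now nullable, callers should guard against hosts that do not support external storage. A minimal sketch, assuming the externalStorageDevices property returns an NSArray of AVExternalStorageDevice:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: return the connected external storage devices, or an empty array when
// the shared discovery session is unavailable on this host.
static NSArray<AVExternalStorageDevice *> *ConnectedStorageDevices(void)
{
    AVExternalStorageDeviceDiscoverySession *session =
        AVExternalStorageDeviceDiscoverySession.sharedSession;
    if (session == nil) {
        // This host does not support external storage devices.
        return @[];
    }
    return session.externalStorageDevices;
}
```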
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFCore.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFCore.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFCore.h 2023-08-19 22:23:05
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFCore.h 2023-10-18 00:40:08
@@ -57,6 +57,7 @@
#import <AVFoundation/AVOutputSettingsAssistant.h>
#import <AVFoundation/AVPlaybackCoordinator.h>
#import <AVFoundation/AVPlayer.h>
+#import <AVFoundation/AVPlayerOutput.h>
#import <AVFoundation/AVPlayerItem.h>
#import <AVFoundation/AVPlayerItemMediaDataCollector.h>
#import <AVFoundation/AVPlayerItemOutput.h>
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes 2023-08-19 21:37:46
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes 2023-10-17 02:39:34
@@ -634,7 +634,7 @@
Methods:
- Selector: 'coordinateRateChangeToRate:options:'
SwiftName: coordinateRateChange(to:options:)
- MethodKind: Instance
+ MethodKind: Instance
Protocols:
- Name: AVCaptureAudioDataOutputSampleBufferDelegate
Methods:
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayer.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayer.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayer.h 2023-08-19 22:14:54
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayer.h 2023-10-17 02:08:11
@@ -37,6 +37,7 @@
@class AVPlayerItem;
@class AVPlayerInternal;
@class AVPlayerPlaybackCoordinator;
+@class AVPlayerVideoOutput;
NS_ASSUME_NONNULL_BEGIN
@@ -469,8 +470,8 @@
Changing the value of this property to NO while the value of timeControlStatus is AVPlayerTimeControlStatusWaitingToPlayAtSpecifiedRate with a reasonForWaitingToPlay of AVPlayerWaitingToMinimizeStallsReason will cause the player to attempt playback at the specified rate immediately.
- For clients linked against iOS 10.0 and running on that version or later or linked against OS X 10.12 and running on that version or later, the default value of this property is YES.
- In versions of iOS prior to iOS 10.0 and versions of OS X prior to 10.12, this property is unavailable, and the behavior of the AVPlayer corresponds to the type of content being played. For streaming content, including HTTP Live Streaming, the AVPlayer acts as if automaticallyWaitsToMinimizeStalling is YES. For file-based content, including file-based content accessed via progressive http download, the AVPlayer acts as if automaticallyWaitsToMinimizeStalling is NO.
+ For clients linked against iOS 10.0 and running on that version or later or linked against macOS 10.12 and running on that version or later, the default value of this property is YES.
+ In versions of iOS prior to iOS 10.0 and versions of macOS prior to 10.12, this property is unavailable, and the behavior of the AVPlayer corresponds to the type of content being played. For streaming content, including HTTP Live Streaming, the AVPlayer acts as if automaticallyWaitsToMinimizeStalling is YES. For file-based content, including file-based content accessed via progressive http download, the AVPlayer acts as if automaticallyWaitsToMinimizeStalling is NO.
If you employ an AVAssetResourceLoader delegate that loads media data for playback, you should set the value of your AVPlayer’s automaticallyWaitsToMinimizeStalling property to NO. Allowing the value of automaticallyWaitsToMinimizeStalling to remain YES when an AVAssetResourceLoader delegate is used for the loading of media data can result in poor start-up times for playback and poor recovery from stalls, because the behaviors provided by AVPlayer when automaticallyWaitsToMinimizeStalling has a value of YES depend on predictions of the future availability of media data that do not function as expected when data is loaded via a client-controlled means, using the AVAssetResourceLoader delegate interface.
@@ -495,7 +496,7 @@
The current item's timebase is adjusted so that its time will be (or was) itemTime when host time is (or was) hostClockTime.
In other words: if hostClockTime is in the past, the timebase's time will be interpolated as though the timebase has been running at the requested rate since that time. If hostClockTime is in the future, the timebase will immediately start running at the requested rate from an earlier time so that it will reach the requested itemTime at the requested hostClockTime. (Note that the item's time will not jump backwards, but instead will sit at itemTime until the timebase reaches that time.)
- Note that setRate:time:atHostTime: is not supported when automaticallyWaitsToMinimizeStalling is YES. For clients linked against iOS 10.0 and later or OS X 12.0 and later, invoking setRate:time:atHostTime: when automaticallyWaitsToMinimizeStalling is YES will raise an NSInvalidArgument exception. Support for HTTP Live Streaming content requires iOS 11, tvOS 11, macOS 10.13 or later.
+ Note that setRate:time:atHostTime: is not supported when automaticallyWaitsToMinimizeStalling is YES. For clients linked against iOS 10.0 and later or macOS 12.0 and later, invoking setRate:time:atHostTime: when automaticallyWaitsToMinimizeStalling is YES will raise an NSInvalidArgument exception. Support for HTTP Live Streaming content requires iOS 11, tvOS 11, macOS 10.13 or later.
Before macOS 13, iOS 16, tvOS 16, and watchOS 9, this method must be invoked on the main thread/queue.
@@ -627,7 +628,7 @@
@interface AVPlayer (AVPlayerAutomaticMediaSelection)
/* Indicates whether the receiver should apply the current selection criteria automatically to AVPlayerItems.
- For clients linked against the iOS 7 SDK or later or against the OS X 10.9 SDK or later, the default is YES. For all others, the default is NO.
+ For clients linked against the iOS 7 SDK or later or against the macOS 10.9 SDK or later, the default is YES. For all others, the default is NO.
By default, AVPlayer applies selection criteria based on system preferences. To override the default criteria for any media selection group, use -[AVPlayer setMediaSelectionCriteria:forMediaCharacteristic:].
*/
@@ -927,6 +928,20 @@
It is left to the owner of the AVPlayer to ensure that all participants are playing the same item. See the discussion of AVPlaybackCoordinator for considerations about item transitions.
*/
@property (readonly, strong) AVPlayerPlaybackCoordinator *playbackCoordinator API_AVAILABLE(macos(12.0), ios(15.0), tvos(15.0)) API_UNAVAILABLE(watchos);
+
+@end
+
+API_AVAILABLE(macos(14.2), ios(17.2), tvos(17.2), watchos(10.2), visionos(1.1))
+@interface AVPlayer (AVPlayerOutputSupport)
+
+/*!
+ @property videoOutput
+ @abstract The video output for this player, if one was set.
+ @discussion When an AVPlayerVideoOutput is associated with an AVPlayer, the AVPlayerVideoOutput can then be used to receive video-related samples during playback.
+ @note If an output is set while the AVPlayer has a current item, it may cause different data channels to be selected for that item, which can have a performance impact.
+ As a result, when possible, it is best to set an output before setting items on an AVPlayer.
+ */
+@property (nonatomic, nullable, readwrite) AVPlayerVideoOutput *videoOutput;
@end
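The new videoOutput property is the attach point for the AVPlayerVideoOutput class introduced in AVPlayerOutput.h further down. A minimal sketch of setting the output before installing the item, as the @note recommends; the stream URL is a placeholder and the output is assumed to have been created already.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: set the video output before replacing the item so the player does not
// have to re-select data channels mid-item.
static void StartPlayback(AVPlayer *player, AVPlayerVideoOutput *output, NSURL *streamURL)
{
    player.videoOutput = output;
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:streamURL];
    [player replaceCurrentItemWithPlayerItem:item];
    [player play];
}
```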
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerItem.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerItem.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerItem.h 2023-08-18 23:48:44
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerItem.h 2023-10-17 02:00:32
@@ -198,6 +198,7 @@
@class AVPlayerItemTrack;
@class AVMetadataItem;
+
@interface AVPlayerItem (AVPlayerItemInspection)
/*!
@@ -273,8 +274,8 @@
@interface AVPlayerItem (AVPlayerItemRateAndSteppingSupport)
-/* For releases of OS X prior to 10.9 and releases of iOS prior to 7.0, indicates whether the item can be played at rates greater than 1.0.
- Starting with OS X 10.9 and iOS 7.0, all AVPlayerItems with status AVPlayerItemReadyToPlay can be played at rates between 1.0 and 2.0, inclusive, even if canPlayFastForward is NO; for those releases canPlayFastForward indicates whether the item can be played at rates greater than 2.0.
+/* For releases of macOS prior to 10.9 and releases of iOS prior to 7.0, indicates whether the item can be played at rates greater than 1.0.
+ Starting with macOS 10.9 and iOS 7.0, all AVPlayerItems with status AVPlayerItemReadyToPlay can be played at rates between 1.0 and 2.0, inclusive, even if canPlayFastForward is NO; for those releases canPlayFastForward indicates whether the item can be played at rates greater than 2.0.
*/
@property (readonly) BOOL canPlayFastForward API_AVAILABLE(macos(10.8), ios(5.0), tvos(9.0), watchos(1.0));
@@ -625,7 +626,7 @@
@discussion
For live streaming content, the player item may need to use extra networking and power resources to keep playback state up to date when paused. For example, when this property is set to YES, the seekableTimeRanges property will be periodically updated to reflect the current state of the live stream.
- For clients linked on or after OS X 10.11 or iOS 9.0, the default value is NO. To minimize power usage, avoid setting this property to YES when you do not need playback state to stay up to date while paused.
+ For clients linked on or after macOS 10.11 or iOS 9.0, the default value is NO. To minimize power usage, avoid setting this property to YES when you do not need playback state to stay up to date while paused.
*/
@property (assign) BOOL canUseNetworkResourcesForLiveStreamingWhilePaused API_AVAILABLE(macos(10.11), ios(9.0), tvos(9.0), watchos(2.0));
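A minimal sketch of opting a live-streaming item into the behavior described above; the HLS URL is a placeholder, and the flag should stay NO unless paused playback state genuinely needs to remain current.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: keep seekableTimeRanges of a live stream up to date while paused.
static AVPlayerItem *MakeLiveItem(void)
{
    NSURL *url = [NSURL URLWithString:@"https://example.com/live/master.m3u8"];
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
    item.canUseNetworkResourcesForLiveStreamingWhilePaused = YES; // costs extra network and power
    return item;
}
```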
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerOutput.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerOutput.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerOutput.h 1969-12-31 19:00:00
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVPlayerOutput.h 2023-10-17 02:08:11
@@ -0,0 +1,178 @@
+#if !__has_include(<AVFCore/AVPlayerOutput.h>)
+/*
+ File: AVPlayerOutput.h
+
+ Framework: AVFoundation
+
+ Copyright 2023 Apple Inc. All rights reserved.
+ */
+
+#import <AVFoundation/AVBase.h>
+#import <CoreMedia/CMTaggedBufferGroup.h>
+#import <CoreMedia/CMTagCollection.h>
+
+@class AVVideoOutputSpecification;
+@class AVPlayerVideoOutputConfiguration;
+@class AVPlayerItem;
+
+NS_ASSUME_NONNULL_BEGIN
+
+/*!
+ @class AVPlayerVideoOutput
+ @abstract AVPlayerVideoOutput offers a way to attach to an AVPlayer and receive video frames and video-related data vended through CMTaggedBufferGroups.
+ @discussion AVPlayerVideoOutput can be attached to an AVPlayer using AVPlayer's method addVideoOutput:
+ Note: An AVPlayerVideoOutput can only be attached to a single player at a time; attempting to attach it to multiple players will result in an exception being thrown.
+ */
+API_AVAILABLE(macos(14.2), ios(17.2), tvos(17.2), watchos(10.2), visionos(1.1))
+@interface AVPlayerVideoOutput : NSObject
+AV_INIT_UNAVAILABLE
+
+/*!
+ @method initWithSpecification:
+ @abstract Creates an instance of AVPlayerVideoOutput, initialized with the specified video output specification.
+ @param specification
+ An instance of AVVideoOutputSpecification, used to recommend data channels to the AVPlayer associated with this AVPlayerVideoOutput.
+ The tag collections owned by the AVVideoOutputSpecification are given a priority based on their position in the array in which AVVideoOutputSpecification holds them, meaning position i takes priority over position i+1.
+ This means that the player will first check if the tag collection at index 0 matches the shape of the current item's data channels.
+ If the item's data channels would not be able to satisfy the shape of the requested tag collection, it will fall back to the next collection and repeat this process.
+ This continues until a tag collection or set of tag collections can be selected; otherwise, if no collections match the shape of the item’s data channels, samples cannot be vended for that item.
+ @result An instance of AVPlayerVideoOutput.
+ @discussion Pixel buffer attributes will be selected from the input AVVideoOutputSpecification based on the data channels selected for an item.
+ If no pixel buffer attributes were set for the selected tag collection, then the default pixel buffer attributes from the AVVideoOutputSpecification will be used if those were set.
+ */
+- (instancetype)initWithSpecification:(AVVideoOutputSpecification *)specification;
+
+/*!
+ @method copyTaggedBufferGroupForHostTime:presentationTimeStamp:activeConfiguration:
+ @abstract Retrieves a tagged buffer group that is appropriate for display at the specified host time.
+ @param hostTime
+ A CMTime that expresses a desired host time.
+ @param presentationTimeStamp
+ On return points to a CMTime whose value is the presentation time in terms of the corresponding AVPlayerItem's timebase for the copied tagged buffer group, or kCMTimeInvalid if no sample is available for the provided hostTime.
+ Note: This timestamp is in terms of the timebase of the AVPlayerItem with which this sample is associated.
+ @param activeConfiguration
+ On return points to the active configuration associated with the copied tagged buffer group, or nil, if no sample is available for the provided hostTime.
+ @result A tagged buffer group for the specified host time if a sample is available, and NULL otherwise.
+ @discussion The client is responsible for releasing the returned CMTaggedBufferGroup.
+ */
+- (CMTaggedBufferGroupRef _Nullable)copyTaggedBufferGroupForHostTime:(CMTime)hostTime
+ presentationTimeStamp:(CMTime * _Nullable)presentationTimeStampOut
+ activeConfiguration:(AVPlayerVideoOutputConfiguration * _Nullable * _Nullable)activeConfigurationOut NS_REFINED_FOR_SWIFT CF_RETURNS_RETAINED;
+@end
+
+#pragma mark - CMTagCollectionExtensions Extensions
+/*!
+ @enum CMTagCollectionVideoOutputPreset
+ @discussion Video output presets supported by CMTagCollectionCreateWithVideoOutputPreset.
+ @constant kCMTagCollectionVideoOutputPreset_Monoscopic
+ Used for video output where there is no stereo view, e.g. kCMTagStereoNone.
+ @constant kCMTagCollectionVideoOutputPreset_Stereoscopic
+ Used for video output where there are two stereo views, for both left and right eyes, e.g. kCMTagStereoLeftAndRight.
+ */
+typedef CF_ENUM(uint32_t, CMTagCollectionVideoOutputPreset) {
+ kCMTagCollectionVideoOutputPreset_Monoscopic,
+ kCMTagCollectionVideoOutputPreset_Stereoscopic,
+} API_AVAILABLE(macos(14.2), ios(17.2), tvos(17.2), watchos(10.2), visionos(1.1))
+ CF_REFINED_FOR_SWIFT;
+
+/*!
+ @function CMTagCollectionCreateWithVideoOutputPreset
+ @abstract Creates a CMTagCollection with the required tags to describe the specified video output requirements.
+ @discussion Convenience constructor to create a CMTagCollection with all of the required tags for use with video output interfaces.
+ @param allocator
+ CFAllocator to use to create the collection and internal data structures.
+ @param preset
+ CMTagCollectionVideoOutputPreset representing the desired video output scenario.
+ @param newCollectionOut
+ Address of a location to the newly created CMTagCollection. The client is responsible for releasing the returned CMTagCollection.
+ @result noErr if successful. Otherwise, an error describing why a tag collection could not be created.
+ */
+CM_EXPORT OSStatus CMTagCollectionCreateWithVideoOutputPreset(
+ CFAllocatorRef CM_NULLABLE allocator,
+ CMTagCollectionVideoOutputPreset preset,
+ CM_RETURNS_RETAINED_PARAMETER CMTagCollectionRef CM_NULLABLE * CM_NONNULL newCollectionOut)
+ API_AVAILABLE(macos(14.2), ios(17.2), tvos(17.2), watchos(10.2), visionos(1.1))
+ CF_REFINED_FOR_SWIFT;
+
+/*!
+ @class AVVideoOutputSpecification
+ @abstract AVVideoOutputSpecification offers a way to package CMTagCollections together with pixel buffer attributes, allowing for a direct association between pixel buffer attributes and specific tag collections, as well as default pixel buffer attributes that apply to all tag collections without a specified mapping.
+ @discussion For more information about working with CMTagCollections and CMTags first look at <CoreMedia/CMTagCollection.h>
+ */
+NS_SWIFT_NONSENDABLE
+API_AVAILABLE(macos(14.2), ios(17.2), tvos(17.2), watchos(10.2), visionos(1.1))
+@interface AVVideoOutputSpecification : NSObject<NSCopying>
+AV_INIT_UNAVAILABLE
+
+/*!
+ @method initWithTagCollections:
+ @abstract Creates an instance of AVVideoOutputSpecification initialized with the specified tag collections.
+ @param tagCollections
+ Expects a non-empty array of CMTagCollections. Tag collections are given priority based on their position in the array, where position i takes priority over position i+1.
+ @discussion This method throws an exception for the following reasons:
+ - tagCollections is nil or has a count of 0.
+ - tagCollections contains elements that are not of the type CMTagCollection.
+ */
+- (instancetype)initWithTagCollections:(NSArray *)tagCollections NS_DESIGNATED_INITIALIZER NS_REFINED_FOR_SWIFT;
+
+/*!
+ @method setOutputPixelBufferAttributes:forTagCollection:
+ @abstract Specifies a mapping between a tag collection and a set of pixel buffer attributes.
+ @param pixelBufferAttributes
+ The client requirements for CVPixelBuffers related to the tags in tagCollection, expressed using the constants in <CoreVideo/CVPixelBuffer.h>.
+ @param tagCollection
+ A single tag collection to which these pixel buffer attributes should map.
+ @discussion If this method is called twice on the same tag collection, the first requested pixel buffer attributes will be overridden.
+ */
+- (void)setOutputPixelBufferAttributes:(NSDictionary<NSString *, id> *)pixelBufferAttributes forTagCollection:(CMTagCollectionRef)tagCollection NS_REFINED_FOR_SWIFT;
+/*!
+ @property preferredTagCollections
+ @abstract Tag collections held by AVVideoOutputSpecification.
+ @discussion Returns an array of CMTagCollections.
+ */
+@property (nonatomic, copy, readonly) NSArray *preferredTagCollections NS_REFINED_FOR_SWIFT;
+
+/*!
+ @property defaultPixelBufferAttributes
+ @abstract The default client requirements for CVPixelBuffers related to all tag collections not explicitly set with setOutputPixelBufferAttributes:forTagCollection:, expressed using the constants in <CoreVideo/CVPixelBuffer.h>.
+ @discussion NSDictionary whose keys are of type NSString; values should match the type specified by the corresponding key's documentation in <CoreVideo/CVPixelBuffer.h>.
+ */
+@property (nonatomic, readwrite, copy, nullable) NSDictionary<NSString *, id> *defaultPixelBufferAttributes;
+
+@end
+
+/*!
+ @class AVPlayerVideoOutputConfiguration
+ @abstract An AVPlayerVideoOutputConfiguration carries an identifier for the AVPlayerItem the configuration is associated with as well as presentation settings for that item.
+ */
+API_AVAILABLE(macos(14.2), ios(17.2), tvos(17.2), watchos(10.2), visionos(1.1))
+@interface AVPlayerVideoOutputConfiguration : NSObject
+AV_INIT_UNAVAILABLE
+
+/*!
+ @property sourcePlayerItem
+ @abstract The AVPlayerItem which is the source of this configuration.
+ @discussion This AVPlayerItem can be seen as the source of all samples vended alongside this configuration.
+ */
+@property (nonatomic, readonly, weak) AVPlayerItem *sourcePlayerItem;
+
+/*!
+ @property dataChannelDescriptions
+ @abstract List of data channels, represented as CMTagCollections, selected for this configuration.
+ @discussion Returns an array of CMTagCollections.
+ */
+@property (nonatomic, readonly, copy) NSArray *dataChannelDescriptions NS_REFINED_FOR_SWIFT;
+
+/*!
+ @property activationTime
+ @abstract Host time when this configuration became active on the player the vending output is attached to.
+ */
+@property (nonatomic, readonly) CMTime activationTime;
+
+@end
+
+NS_ASSUME_NONNULL_END
+
+#else
+#import <AVFCore/AVPlayerOutput.h>
+#endif
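A minimal end-to-end sketch of the new interfaces declared in this header: build a monoscopic tag collection with CMTagCollectionCreateWithVideoOutputPreset, wrap it in an AVVideoOutputSpecification, create the AVPlayerVideoOutput, and later copy a tagged buffer group for the current host time. The pixel format choice and the abbreviated error handling are placeholders.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: create a monoscopic AVPlayerVideoOutput, then copy the tagged buffer
// group that corresponds to the current host time.
static AVPlayerVideoOutput *MakeMonoscopicOutput(void)
{
    CMTagCollectionRef tags = NULL;
    OSStatus status = CMTagCollectionCreateWithVideoOutputPreset(
        kCFAllocatorDefault, kCMTagCollectionVideoOutputPreset_Monoscopic, &tags);
    if (status != noErr || tags == NULL) {
        return nil;
    }

    AVVideoOutputSpecification *spec =
        [[AVVideoOutputSpecification alloc] initWithTagCollections:@[ (__bridge id)tags ]];
    spec.defaultPixelBufferAttributes =
        @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    CFRelease(tags);

    return [[AVPlayerVideoOutput alloc] initWithSpecification:spec];
}

static void CopyCurrentFrame(AVPlayerVideoOutput *output)
{
    CMTime now = CMClockGetTime(CMClockGetHostTimeClock());
    CMTime pts = kCMTimeInvalid;
    AVPlayerVideoOutputConfiguration *config = nil;
    CMTaggedBufferGroupRef group =
        [output copyTaggedBufferGroupForHostTime:now
                           presentationTimeStamp:&pts
                             activeConfiguration:&config];
    if (group != NULL) {
        // Read the per-tag buffers here; the caller owns the returned group.
        CFRelease(group);
    }
}
```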
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoComposition.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoComposition.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoComposition.h 2023-08-19 00:01:26
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoComposition.h 2023-09-22 22:50:10
@@ -214,7 +214,7 @@
The default CIContext has the following properties:
- iOS: Device RGB color space
- - OS X: sRGB color space
+ - macOS: sRGB color space
Example usage:
@@ -253,7 +253,7 @@
The default CIContext has the following properties:
- iOS: Device RGB color space
- - OS X: sRGB color space
+ - macOS: sRGB color space
Example usage:
@@ -505,7 +505,7 @@
The default CIContext has the following properties:
- iOS: Device RGB color space
- - OS X: sRGB color space
+ - macOS: sRGB color space
Example usage:
@@ -542,7 +542,7 @@
The default CIContext has the following properties:
- iOS: Device RGB color space
- - OS X: sRGB color space
+ - macOS: sRGB color space
Example usage:
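The hunks above only rename OS X to macOS in the default-CIContext notes, and the headers' own "Example usage" blocks are elided from this diff. For reference, here is a minimal sketch of the filter-handler constructor those notes accompany; the CISepiaTone filter is an arbitrary choice.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Sketch: per-frame Core Image filtering; finishing with a nil context uses the
// default CIContext described in the notes above.
static AVMutableVideoComposition *MakeFilteredComposition(AVAsset *asset)
{
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    return [AVMutableVideoComposition videoCompositionWithAsset:asset
                                   applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
        [filter setValue:request.sourceImage forKey:kCIInputImageKey];
        CIImage *filtered = filter.outputImage ?: request.sourceImage;
        [request finishWithImage:filtered context:nil];
    }];
}
```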
diff -ruN /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h
--- /Applications/Xcode_15.0.0.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h 2023-08-16 00:21:34
+++ /Applications/Xcode_15.1.0-beta2.app/Contents/Developer/Platforms/AppleTVOS.platform/Developer/SDKs/AppleTVOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h 2023-09-22 22:50:10
@@ -69,7 +69,7 @@
@discussion
The value for this key is an NSDictionary containing AVVideoPixelAspectRatio*Key keys. If no value is specified for this key, the default value for the codec is used. Usually this is 1:1, meaning square pixels.
- Note that prior to OS X 10.9 and iOS 7.0, this key could only be specified as part of the dictionary given for AVVideoCompressionPropertiesKey. As of OS X 10.9 and iOS 7.0, the top level of an AVVideoSettings dictionary is the preferred place to specify this key.
+ Note that prior to macOS 10.9 and iOS 7.0, this key could only be specified as part of the dictionary given for AVVideoCompressionPropertiesKey. As of macOS 10.9 and iOS 7.0, the top level of an AVVideoSettings dictionary is the preferred place to specify this key.
*/
AVF_EXPORT NSString *const AVVideoPixelAspectRatioKey API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0)) API_UNAVAILABLE(watchos);
AVF_EXPORT NSString *const AVVideoPixelAspectRatioHorizontalSpacingKey /* NSNumber */ API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0)) API_UNAVAILABLE(watchos);
@@ -83,7 +83,7 @@
If no clean aperture region is specified, the entire frame will be displayed during playback.
- Note that prior to OS X 10.9 and iOS 7.0, this key could only be specified as part of the dictionary given for AVVideoCompressionPropertiesKey. As of OS X 10.9 and iOS 7.0, the top level of an AVVideoSettings dictionary is the preferred place to specify this key.
+ Note that prior to macOS 10.9 and iOS 7.0, this key could only be specified as part of the dictionary given for AVVideoCompressionPropertiesKey. As of macOS 10.9 and iOS 7.0, the top level of an AVVideoSettings dictionary is the preferred place to specify this key.
*/
AVF_EXPORT NSString *const AVVideoCleanApertureKey API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0)) API_UNAVAILABLE(watchos);
AVF_EXPORT NSString *const AVVideoCleanApertureWidthKey /* NSNumber */ API_AVAILABLE(macos(10.7), ios(4.0), tvos(9.0)) API_UNAVAILABLE(watchos);
@@ -264,7 +264,7 @@
Most keys can only be used for certain decoders. Look at individual keys for details.
*/
-AVF_EXPORT NSString *const AVVideoDecompressionPropertiesKey /* NSDictionary */ API_AVAILABLE(macos(10.13)) API_UNAVAILABLE(ios, tvos, watchos);
+AVF_EXPORT NSString *const AVVideoDecompressionPropertiesKey /* NSDictionary */ API_AVAILABLE(macos(10.13), ios(17.0)) API_UNAVAILABLE(tvos, watchos);
/*!
@constant AVVideoEncoderSpecificationKey
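Beyond the macOS renames, this hunk makes AVVideoDecompressionPropertiesKey available on iOS 17. The notes above also recommend specifying the pixel aspect ratio at the top level of an AVVideoSettings dictionary; a minimal sketch of such a dictionary follows, with illustrative H.264/1080p values.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: top-level pixel aspect ratio in an AVVideoSettings dictionary, as
// recommended for macOS 10.9 / iOS 7.0 and later.
static NSDictionary<NSString *, id> *OutputVideoSettings(void)
{
    return @{
        AVVideoCodecKey  : AVVideoCodecTypeH264,
        AVVideoWidthKey  : @1920,
        AVVideoHeightKey : @1080,
        AVVideoPixelAspectRatioKey : @{
            AVVideoPixelAspectRatioHorizontalSpacingKey : @1,
            AVVideoPixelAspectRatioVerticalSpacingKey   : @1,
        },
    };
}
```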