
Commit 8d6f002 (2 parents: 2d8cb06 + 0d42329)

Merge branch 'bugfix/dedup-partial-aot-compilation' of github.com:xamarin/xamarin-macios into bugfix/dedup-partial-aot-compilation

481 files changed (+23788 / −2153 lines)


docs/api/ARKit/ARSession.xml

Lines changed: 16 additions & 0 deletions
<Documentation>
  <Docs DocId="T:ARKit.ARSession">
    <summary>Manages the camera capture, motion processing, and image analysis necessary to create a mixed-reality experience.</summary>
    <remarks>
      <para>An <see cref="T:ARKit.ARSession" /> object represents the system resources required for a mixed-reality experience. The <see cref="M:ARKit.ARSession.Run(ARKit.ARConfiguration,ARKit.ARSessionRunOptions)" /> method must be passed an <format type="text/html"><a href="https://docs.microsoft.com/en-us/search/index?search=ARKit%20ARSession%20Configuration&amp;scope=Xamarin" title="T:ARKit.ARSessionConfiguration">T:ARKit.ARSessionConfiguration</a></format> object that controls specific behaviors.</para>
      <para>Developers who use the <see cref="T:ARKit.ARSCNView" /> to present their AR imagery do not need to instantiate their own <see cref="T:ARKit.ARSession" /> object, but should instead call <see cref="M:ARKit.ARSession.Run(ARKit.ARConfiguration,ARKit.ARSessionRunOptions)" /> on the <see cref="P:ARKit.ARSCNView.Session" /> property. For example:</para>
      <example>
        <code lang="csharp lang-csharp"><![CDATA[
var arView = new ARSCNView();
var arConfig = new ARWorldTrackingSessionConfiguration { PlaneDetection = ARPlaneDetection.Horizontal };
arView.Session.Run (arConfig);
]]></code>
      </example>
    </remarks>
  </Docs>
</Documentation>

docs/api/AVFoundation/AVAsset.xml

Lines changed: 14 additions & 0 deletions
<Documentation>
  <Docs DocId="T:AVFoundation.AVAsset">
    <summary>Base class for timed video and audio.</summary>
    <remarks>
      <para>An <see cref="T:AVFoundation.AVAsset" /> represents one or more media assets. These are held in its <see cref="P:AVFoundation.AVAsset.Tracks" /> property. Additionally, <see cref="T:AVFoundation.AVAsset" />s include metadata, track grouping, and preferences about the media.</para>
      <para>Because media assets such as movies are large, instantiating an <see cref="T:AVFoundation.AVAsset" /> will not automatically load the file. Properties are loaded when they are queried or via explicit calls to <see cref="M:AVFoundation.AVAsset.LoadValuesTaskAsync(System.String[])" /> or <see cref="M:AVFoundation.AVAsset.LoadValuesAsynchronously(System.String[],System.Action)" />.</para>
      <para>During playback, the current presentation state of an <see cref="T:AVFoundation.AVAsset" /> is represented by an <see cref="T:AVFoundation.AVPlayerItem" /> object, and playback is controlled by an <see cref="T:AVFoundation.AVPlayer" />:</para>
      <para>
        <img href="~/AVFoundation/_images/AVFoundation.AssetPlayerItemPlayer.png" alt="UML class diagram illustrating classes relating to AVAsset" />
      </para>
    </remarks>
    <related type="externalDocumentation" href="https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVAsset_Class/index.html">Apple documentation for <c>AVAsset</c></related>
  </Docs>
</Documentation>
Lines changed: 29 additions & 0 deletions
<Documentation>
  <Docs DocId="T:AVFoundation.AVAudioEnvironmentDistanceAttenuationModel">
    <summary>Enumerates attenuation models used by <see cref="T:AVFoundation.AVAudioEnvironmentDistanceAttenuationParameters" />.</summary>
    <remarks>
      <para>Graphs of <c>Gain</c> as distance ranges from 0 to 10, with <c>ReferenceDistance = 5</c>, <c>RolloffFactor = 0.5</c>, and <c>MaximumDistance = 20</c>:</para>
      <para>
        <see cref="F:AVFoundation.AVAudioEnvironmentDistanceAttenuationModel.Exponential" />
      </para>
      <para>
        <img href="~/AVFoundation/_images/AVFoundation.AVAudioEnvironmentDistanceAttenuationModel.Exponential.png" alt="Graph of exponential attenuation" />
      </para>
      <para>
        <see cref="F:AVFoundation.AVAudioEnvironmentDistanceAttenuationModel.Inverse" />
      </para>
      <para>
        <img href="~/AVFoundation/_images/AVFoundation.AVAudioEnvironmentDistanceAttenuationModel.Inverse.png" alt="Graph of inverse attenuation" />
      </para>
      <para>
        <see cref="F:AVFoundation.AVAudioEnvironmentDistanceAttenuationModel.Linear" />
      </para>
      <para>
        <img href="~/AVFoundation/_images/AVFoundation.AVAudioEnvironmentDistanceAttenuationModel.Linear.png" alt="Graph of linear attenuation" />
      </para>
    </remarks>
  </Docs>
</Documentation>
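The three enumeration values correspond to the standard OpenAL-style distance-attenuation curves; the following Python sketch is an assumption based on those well-known formulas (Apple documents the exact shapes only through the graphs referenced above), using the parameter values from the caption (`ReferenceDistance = 5`, `RolloffFactor = 0.5`, `MaximumDistance = 20`):

```python
# Hypothetical sketch of the three distance-attenuation models, assuming the
# OpenAL-style formulas that these enumeration values appear to mirror.
# Parameter names follow AVAudioEnvironmentDistanceAttenuationParameters.

def _clamp_distance(d, reference_distance, maximum_distance):
    """Distances are clamped to [ReferenceDistance, MaximumDistance]."""
    return max(reference_distance, min(d, maximum_distance))

def exponential_gain(d, reference_distance=5.0, rolloff_factor=0.5, maximum_distance=20.0):
    # Gain falls off as (d / refDist) ^ -rolloff.
    d = _clamp_distance(d, reference_distance, maximum_distance)
    return (d / reference_distance) ** -rolloff_factor

def inverse_gain(d, reference_distance=5.0, rolloff_factor=0.5, maximum_distance=20.0):
    # Gain falls off as refDist / (refDist + rolloff * (d - refDist)).
    d = _clamp_distance(d, reference_distance, maximum_distance)
    return reference_distance / (reference_distance + rolloff_factor * (d - reference_distance))

def linear_gain(d, reference_distance=5.0, rolloff_factor=0.5, maximum_distance=20.0):
    # Gain falls off linearly between ReferenceDistance and MaximumDistance.
    d = _clamp_distance(d, reference_distance, maximum_distance)
    return 1.0 - rolloff_factor * (d - reference_distance) / (maximum_distance - reference_distance)
```

With these parameters every model yields a gain of 1.0 at or below the reference distance; at the maximum distance the exponential and linear curves here reach 0.5 and the inverse curve 0.4, which is why the graphs diverge mainly at longer ranges.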
Lines changed: 29 additions & 0 deletions
<Documentation>
  <Docs DocId="T:AVFoundation.AVAudioRecorder">
    <summary>Audio recording class.</summary>
    <remarks>
      <para>To create instances of this class, use the factory method <format type="text/html"><a href="https://docs.microsoft.com/en-us/search/index?search=AVFoundation%20AVRecorder%20To%20Url(%20Foundation%20NSUrl%20, %20AVFoundation%20AVAudio%20Recorder%20Settings%20,Foundation%20NSError%20)&amp;scope=Xamarin" title="M:AVFoundation.AVRecorder.ToUrl(Foundation.NSUrl, AVFoundation.AVAudioRecorderSettings,Foundation.NSError)">M:AVFoundation.AVRecorder.ToUrl(Foundation.NSUrl, AVFoundation.AVAudioRecorderSettings,Foundation.NSError)</a></format>:</para>
      <example>
        <code lang="csharp lang-csharp"><![CDATA[
var settings = new AVAudioRecorderSettings () {
    AudioFormat = AudioFormatType.LinearPCM,
    AudioQuality = AVAudioQuality.High,
    SampleRate = 44100f,
    NumberChannels = 1
};
NSError error;
// url is the NSUrl of the destination file
var recorder = AVAudioRecorder.ToUrl (url, settings, out error);
if (recorder == null) {
    Console.WriteLine (error);
    return;
}
recorder.PrepareToRecord ();
recorder.Record ();
]]></code>
      </example>
    </remarks>
    <related type="recipe" href="https://developer.xamarin.com/ios/Recipes/Media/Sound/Play_Sound">Play Sound</related>
    <related type="recipe" href="https://developer.xamarin.com/ios/Recipes/Media/Sound/Record_Sound">Record Sound</related>
    <related type="externalDocumentation" href="https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVAudioRecorder_ClassReference/index.html">Apple documentation for <c>AVAudioRecorder</c></related>
  </Docs>
</Documentation>
Lines changed: 67 additions & 0 deletions
<Documentation>
  <Docs DocId="T:AVFoundation.AVAudioSession">
    <summary>Coordinates an audio playback or capture session.</summary>
    <remarks>
      <para>Application developers should use the singleton object retrieved by <see cref="M:AVFoundation.AVAudioSession.SharedInstance" />.</para>
      <para>Because the audio hardware of an iOS device is shared between all apps, audio settings can only be "preferred" (see the <c>SetPreferred*</c> methods), and the application developer must account for use-cases where these preferences are overridden.</para>
      <para>The interaction of an app with other apps and system services is determined by your audio category. You can use the <see cref="M:AVFoundation.AVAudioSession.SetCategory(System.String,System.String,AVFoundation.AVAudioSessionRouteSharingPolicy,AVFoundation.AVAudioSessionCategoryOptions,Foundation.NSError@)" /> method to set this.</para>
      <para>You should also set the mode (using <see cref="M:AVFoundation.AVAudioSession.SetMode(Foundation.NSString,Foundation.NSError@)" />) to describe how your application will use audio.</para>
      <para>As is common in AV Foundation, many methods in <see cref="T:AVFoundation.AVAudioSession" /> are asynchronous, and properties may take some time to reflect their final status. Application developers should be familiar with asynchronous programming techniques.</para>
      <para>The <see cref="T:AVFoundation.AVAudioSession" />, like the <see cref="T:AVFoundation.AVCaptureSession" /> and <see cref="T:AVFoundation.AVAssetExportSession" />, is a coordinating object between some number of <see cref="P:AVFoundation.AVAudioSession.InputDataSources" /> and <see cref="P:AVFoundation.AVAudioSession.OutputDataSources" />.</para>
      <para>You can register for a number of notifications that are posted by the audio system by using the convenience methods in <see cref="T:AVFoundation.AVAudioSession.Notifications" />:</para>
      <example>
        <code lang="csharp lang-csharp"><![CDATA[
void Setup ()
{
    AVAudioSession.SharedInstance ().Init ();
    NSError error;
    if (!AVAudioSession.SharedInstance ().SetCategory (AVAudioSessionCategory.Playback, out error)) {
        ReportError (error);
        return;
    }
    AVAudioSession.Notifications.ObserveInterruption (ToneInterruptionListener);

    if (!AVAudioSession.SharedInstance ().SetActive (true, out error)) {
        ReportError (error);
        return;
    }

    void ToneInterruptionListener (object sender, AVAudioSessionInterruptionEventArgs interruptArgs)
    {
        // Handle the interruption (for example, pause playback) here.
    }
}
]]></code>
      </example>
    </remarks>
    <related type="externalDocumentation" href="https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVAudioSession_ClassReference/index.html">Apple documentation for <c>AVAudioSession</c></related>
  </Docs>
</Documentation>
Lines changed: 11 additions & 0 deletions
<Documentation>
  <Docs DocId="T:AVFoundation.AVCaptureConnection">
    <summary>The link between capture input and capture output objects during a capture session.</summary>
    <remarks>
      <para>A <see cref="T:AVFoundation.AVCaptureConnection" /> encapsulates the link between an <see cref="T:AVFoundation.AVCaptureInput" /> (more specifically, an individual <see cref="T:AVFoundation.AVCaptureInputPort" /> in the <see cref="P:AVFoundation.AVCaptureInput.Ports" /> property of the <see cref="T:AVFoundation.AVCaptureInput" />) and the <see cref="T:AVFoundation.AVCaptureOutput" />.</para>
      <para>
        <see cref="T:AVFoundation.AVCaptureConnection" />s are formed automatically when inputs and outputs are added via <see cref="M:AVFoundation.AVCaptureSession.AddInput(AVFoundation.AVCaptureInput)" /> and <see cref="M:AVFoundation.AVCaptureSession.AddOutput(AVFoundation.AVCaptureOutput)" />.</para>
    </remarks>
    <related type="externalDocumentation" href="https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureConnection_Class/index.html">Apple documentation for <c>AVCaptureConnection</c></related>
  </Docs>
</Documentation>
Lines changed: 185 additions & 0 deletions
<Documentation>
  <Docs DocId="T:AVFoundation.AVCaptureSession">
    <summary>Coordinates a recording session.</summary>
    <remarks>
      <para>The <see cref="T:AVFoundation.AVCaptureSession" /> object coordinates the recording of video or audio input and passes the recorded information to one or more output objects. As the iOS line has advanced, different devices have gained multiple capture devices (in particular, multiple cameras). Application developers can use <see cref="M:AVFoundation.AVCaptureDevice.DefaultDeviceWithMediaType(System.String)" /> or <see cref="M:AVFoundation.AVCaptureDevice.DevicesWithMediaType(System.String)" />, passing in the constants defined in <see cref="T:AVFoundation.AVMediaType" />.</para>
      <para>Configuring capture consists of setting the <see cref="P:AVFoundation.AVCaptureSession.Inputs" /> and <see cref="P:AVFoundation.AVCaptureSession.Outputs" /> properties of the <see cref="T:AVFoundation.AVCaptureSession" />. Notice that multiple <see cref="T:AVFoundation.AVCaptureInput" />s and <see cref="T:AVFoundation.AVCaptureOutput" />s are possible. For instance, to capture both audio and video, one would use two capture inputs:</para>
      <example>
        <code lang="csharp lang-csharp"><![CDATA[
var session = new AVCaptureSession ();

var camera = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video);
var mic = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Audio);
if (camera == null || mic == null) {
    throw new Exception ("Can't find devices");
}

var cameraInput = AVCaptureDeviceInput.FromDevice (camera);
// Info.plist _must_ contain the NSMicrophoneUsageDescription key
var micInput = AVCaptureDeviceInput.FromDevice (mic);

if (session.CanAddInput (cameraInput)) {
    session.AddInput (cameraInput);
}
if (session.CanAddInput (micInput)) {
    session.AddInput (micInput);
}
]]></code>
      </example>
      <para>Note that permission to access the microphone (and, in some regions, the camera) must be given by the user, requiring the developer to add the <c>NSMicrophoneUsageDescription</c> key to the application's Info.plist file.</para>
      <para>Video can be captured directly to file with <see cref="T:AVFoundation.AVCaptureMovieFileOutput" />. However, this class has no displayable data and cannot be used simultaneously with <see cref="T:AVFoundation.AVCaptureVideoDataOutput" />. Instead, application developers can use it in combination with an <see cref="T:AVFoundation.AVCaptureVideoPreviewLayer" />, as shown in the following example:</para>
      <example>
        <code lang="csharp lang-csharp"><![CDATA[
var layer = new AVCaptureVideoPreviewLayer (session);
layer.VideoGravity = AVLayerVideoGravity.ResizeAspectFill;

var cameraView = new UIView ();
cameraView.Layer.AddSublayer (layer);

var filePath = Path.Combine (Path.GetTempPath (), "temporary.mov");
var fileUrl = NSUrl.FromFilename (filePath);

var movieFileOutput = new AVCaptureMovieFileOutput ();
var recordingDelegate = new MyRecordingDelegate ();
session.AddOutput (movieFileOutput);

movieFileOutput.StartRecordingToOutputFile (fileUrl, recordingDelegate);
]]></code>
      </example>
      <para>Application developers should note that the <see cref="M:AVFoundation.AVCaptureFileOutput.StopRecording" /> method is asynchronous; developers should wait until the <see cref="M:AVFoundation.AVCaptureFileOutputRecordingDelegate.FinishedRecording(AVFoundation.AVCaptureFileOutput,Foundation.NSUrl,Foundation.NSObject[],Foundation.NSError)" /> delegate method is called before manipulating the file (for instance, before saving it to the Photos album with <see cref="M:UIKit.UIVideo.SaveToPhotosAlbum(System.String,UIKit.UIVideo.SaveStatus)" /> or <see cref="M:AssetsLibrary.ALAssetsLibrary.WriteVideoToSavedPhotosAlbumAsync(Foundation.NSUrl)" />).</para>
      <example>
        <code lang="csharp lang-csharp"><![CDATA[
public class MyRecordingDelegate : AVCaptureFileOutputRecordingDelegate
{
    public override void FinishedRecording (AVCaptureFileOutput captureOutput, NSUrl outputFileUrl, NSObject [] connections, NSError error)
    {
        if (UIVideo.IsCompatibleWithSavedPhotosAlbum (outputFileUrl.Path)) {
            var library = new ALAssetsLibrary ();
            library.WriteVideoToSavedPhotosAlbum (outputFileUrl, (path, e2) => {
                if (e2 != null) {
                    new UIAlertView ("Error", e2.ToString (), null, "OK", null).Show ();
                } else {
                    new UIAlertView ("Saved", "Saved to Photos", null, "OK", null).Show ();
                    File.Delete (outputFileUrl.Path);
                }
            });
        } else {
            new UIAlertView ("Incompatible", "Incompatible", null, "OK", null).Show ();
        }
    }
}
]]></code>
      </example>
      <para>Application developers can configure one or more output ports for the captured data; these can be still frames, video frames with timing information, audio samples, or QuickTime movie files, or the captured data can be rendered directly to a CoreAnimation layer.</para>
      <para>Once the input and output components of the session are set, the actual processing is begun by calling the <see cref="M:AVFoundation.AVCaptureSession.StartRunning" /> method.</para>
      <example>
        <code lang="csharp lang-csharp"><![CDATA[
bool SetupCapture ()
{
    // Configure the capture session for low resolution; change this if your code
    // can cope with more data or volume.
    session = new AVCaptureSession () {
        SessionPreset = AVCaptureSession.PresetMedium
    };

    // Create a device input and attach it to the session.
    var captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video);
    var input = AVCaptureDeviceInput.FromDevice (captureDevice);
    if (input == null) {
        Console.WriteLine ("No video input device");
        return false;
    }
    session.AddInput (input);

    // Create a VideoDataOutput and add it to the session.
    var output = new AVCaptureVideoDataOutput () {
        VideoSettings = new AVVideoSettings (CVPixelFormatType.CV32BGRA),

        // If you want to cap the frame rate at a given speed, in this sample: 15 frames per second
        MinFrameDuration = new CMTime (1, 15)
    };

    // Configure the output.
    queue = new MonoTouch.CoreFoundation.DispatchQueue ("myQueue");
    outputRecorder = new OutputRecorder ();
    output.SetSampleBufferDelegateAndQueue (outputRecorder, queue);
    session.AddOutput (output);

    session.StartRunning ();
    return true;
}

public class OutputRecorder : AVCaptureVideoDataOutputSampleBufferDelegate {
    public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        try {
            var image = ImageFromSampleBuffer (sampleBuffer);

            // Do something with the image; we just stuff it in our main view.
            AppDelegate.ImageView.BeginInvokeOnMainThread (delegate {
                AppDelegate.ImageView.Image = image;
            });

            //
            // Although this looks innocent ("Oh, he is just optimizing this case away"),
            // it is incredibly important to call Dispose in this callback, because AVFoundation
            // has a fixed number of buffers and, if it runs out of free buffers, it will stop
            // delivering frames.
            //
            sampleBuffer.Dispose ();
        } catch (Exception e) {
            Console.WriteLine (e);
        }
    }

    UIImage ImageFromSampleBuffer (CMSampleBuffer sampleBuffer)
    {
        // Get the CoreVideo image
        using (var pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer) {
            // Lock the base address
            pixelBuffer.Lock (0);
            // Get the geometry and layout of the pixel buffer
            var baseAddress = pixelBuffer.BaseAddress;
            int bytesPerRow = pixelBuffer.BytesPerRow;
            int width = pixelBuffer.Width;
            int height = pixelBuffer.Height;
            var flags = CGBitmapFlags.PremultipliedFirst | CGBitmapFlags.ByteOrder32Little;
            // Create a CGImage in the device RGB colorspace from the parameters configured above
            using (var cs = CGColorSpace.CreateDeviceRGB ())
            using (var context = new CGBitmapContext (baseAddress, width, height, 8, bytesPerRow, cs, (CGImageAlphaInfo) flags))
            using (var cgImage = context.ToImage ()) {
                pixelBuffer.Unlock (0);
                return UIImage.FromImage (cgImage);
            }
        }
    }
}
]]></code>
      </example>
    </remarks>
    <related type="externalDocumentation" href="https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureSession_Class/index.html">Apple documentation for <c>AVCaptureSession</c></related>
  </Docs>
</Documentation>
