[camera_windows] Support camera on windows desktop platform (#4641)

This PR adds a Windows desktop implementation of the [camera plugin](https://pub.dev/packages/camera) using Media Foundation and the CaptureEngine interface.
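
As a rough reference for reviewers, the CaptureEngine bring-up looks roughly like the sketch below (error handling trimmed; the callback parameter stands in for the plugin's event-callback class, whose actual name may differ):

```cpp
#include <mfapi.h>
#include <mfcaptureengine.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Creates and initializes an IMFCaptureEngine for the given video source.
// Assumes COM is already initialized on the calling thread.
HRESULT InitCaptureEngine(IMFCaptureEngineOnEventCallback* callback,
                          IMFMediaSource* video_source,
                          ComPtr<IMFCaptureEngine>* engine_out) {
  HRESULT hr = MFStartup(MF_VERSION);
  if (FAILED(hr)) return hr;

  ComPtr<IMFCaptureEngineClassFactory> factory;
  hr = CoCreateInstance(CLSID_MFCaptureEngineClassFactory, nullptr,
                        CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&factory));
  if (FAILED(hr)) return hr;

  ComPtr<IMFCaptureEngine> engine;
  hr = factory->CreateInstance(CLSID_MFCaptureEngine, IID_PPV_ARGS(&engine));
  if (FAILED(hr)) return hr;

  // Attributes and audio source are optional; the video source is passed
  // directly. Engine events (initialized, photo taken, errors, ...) are
  // delivered to |callback|.
  hr = engine->Initialize(callback, nullptr, nullptr, video_source);
  if (FAILED(hr)) return hr;

  *engine_out = engine;
  return S_OK;
}
```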

The implementation supports multiple cameras, video preview, and capturing photos and videos. It works with the camera plugin and the [camera example](https://pub.dev/packages/camera/example).

The current implementation uses FlutterDesktopPixelBuffer for rendering the preview, but could be switched to shared textures or platform views once those are supported.
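
The pixel-buffer path registers a texture with the Flutter Windows texture registrar roughly as sketched below (class and helper names here are illustrative, not the plugin's actual ones):

```cpp
#include <flutter/texture_registrar.h>

#include <cstdint>

// Owns the preview texture and hands the latest RGBA frame to Flutter.
class PreviewTexture {
 public:
  explicit PreviewTexture(flutter::TextureRegistrar* registrar)
      : registrar_(registrar),
        texture_(flutter::PixelBufferTexture(
            [this](size_t width,
                   size_t height) -> const FlutterDesktopPixelBuffer* {
              // Called by the engine when it wants to copy the next frame.
              return &pixel_buffer_;
            })) {
    texture_id_ = registrar_->RegisterTexture(&texture_);
  }

  // Called from the capture callback after a new frame has been converted
  // and stored behind |pixel_buffer_|.
  void OnFrameAvailable() {
    registrar_->MarkTextureFrameAvailable(texture_id_);
  }

  int64_t texture_id() const { return texture_id_; }

 private:
  flutter::TextureRegistrar* registrar_;
  flutter::TextureVariant texture_;
  int64_t texture_id_ = -1;
  FlutterDesktopPixelBuffer pixel_buffer_ = {};
};
```

The texture id returned by `RegisterTexture` is what the Dart side uses for the preview `Texture` widget.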

Unit tests are partially done; not all failure code paths of the CaptureControllerImpl class are covered yet.

Some features are still missing, such as exposure and focus controls. These can be added later using the IAMCameraControl and IAMVideoProcAmp interfaces.
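
For example, a manual exposure setter could later look roughly like this (a sketch only; whether a given capture source exposes IAMCameraControl via QueryInterface depends on the device and driver):

```cpp
#include <windows.h>
#include <strmif.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Attempts to apply a manual exposure value to the capture source.
// |media_source| is the IMFMediaSource (or other source object) used by
// the capture engine.
HRESULT SetManualExposure(IUnknown* media_source, long exposure_value) {
  ComPtr<IAMCameraControl> camera_control;
  HRESULT hr = media_source->QueryInterface(IID_PPV_ARGS(&camera_control));
  if (FAILED(hr)) {
    return hr;  // The device does not expose IAMCameraControl.
  }

  long min_value = 0, max_value = 0, step = 0, default_value = 0, caps = 0;
  hr = camera_control->GetRange(CameraControl_Exposure, &min_value, &max_value,
                                &step, &default_value, &caps);
  if (FAILED(hr)) return hr;

  // Clamp the requested value into the supported range before applying it.
  long value = exposure_value;
  if (value < min_value) value = min_value;
  if (value > max_value) value = max_value;
  return camera_control->Set(CameraControl_Exposure, value,
                             CameraControl_Flags_Manual);
}
```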

Resolves flutter/flutter#41709: [camera] Add Windows support
diff --git a/packages/camera/camera_windows/.gitignore b/packages/camera/camera_windows/.gitignore
new file mode 100644
index 0000000..e9dc58d
--- /dev/null
+++ b/packages/camera/camera_windows/.gitignore
@@ -0,0 +1,7 @@
+.DS_Store
+.dart_tool/
+
+.packages
+.pub/
+
+build/
diff --git a/packages/camera/camera_windows/.metadata b/packages/camera/camera_windows/.metadata
new file mode 100644
index 0000000..5bed526
--- /dev/null
+++ b/packages/camera/camera_windows/.metadata
@@ -0,0 +1,10 @@
+# This file tracks properties of this Flutter project.
+# Used by Flutter tool to assess capabilities and perform upgrades etc.
+#
+# This file should be version controlled and should not be manually edited.
+
+version:
+  revision: 18116933e77adc82f80866c928266a5b4f1ed645
+  channel: stable
+
+project_type: plugin
diff --git a/packages/camera/camera_windows/AUTHORS b/packages/camera/camera_windows/AUTHORS
new file mode 100644
index 0000000..b2178a5
--- /dev/null
+++ b/packages/camera/camera_windows/AUTHORS
@@ -0,0 +1,8 @@
+# Below is a list of people and organizations that have contributed
+# to the Flutter project. Names should be added to the list like so:
+#
+#   Name/Organization <email address>
+
+Google Inc.
+Joonas Kerttula <joonas.kerttula@codemate.com>
+Codemate Ltd.
diff --git a/packages/camera/camera_windows/CHANGELOG.md b/packages/camera/camera_windows/CHANGELOG.md
new file mode 100644
index 0000000..1318780
--- /dev/null
+++ b/packages/camera/camera_windows/CHANGELOG.md
@@ -0,0 +1,3 @@
+## 0.1.0
+
+* Initial release
diff --git a/packages/camera/camera_windows/LICENSE b/packages/camera/camera_windows/LICENSE
new file mode 100644
index 0000000..c6823b8
--- /dev/null
+++ b/packages/camera/camera_windows/LICENSE
@@ -0,0 +1,25 @@
+Copyright 2013 The Flutter Authors. All rights reserved.
+
+Redistribution and use in source and binary forms, with or without modification,
+are permitted provided that the following conditions are met:
+
+    * Redistributions of source code must retain the above copyright
+      notice, this list of conditions and the following disclaimer.
+    * Redistributions in binary form must reproduce the above
+      copyright notice, this list of conditions and the following
+      disclaimer in the documentation and/or other materials provided
+      with the distribution.
+    * Neither the name of Google Inc. nor the names of its
+      contributors may be used to endorse or promote products derived
+      from this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
+ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
+WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
+ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
+(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
+LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
+ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
+SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/packages/camera/camera_windows/README.md b/packages/camera/camera_windows/README.md
new file mode 100644
index 0000000..dc27bcc
--- /dev/null
+++ b/packages/camera/camera_windows/README.md
@@ -0,0 +1,66 @@
+# Camera Windows Plugin
+
+The Windows implementation of [`camera`][camera].
+
+*Note*: This plugin is under development.
+See [missing implementations and limitations](#missing-features-on-the-windows-platform).
+
+## Usage
+
+### Depend on the package
+
+This package is not an [endorsed][endorsed-federated-plugin]
+implementation of the [`camera`][camera] plugin, so you'll need to
+[add it explicitly][install].
+
+## Missing features on the Windows platform
+
+### Device orientation
+
+Device orientation detection
+is not yet implemented: [issue #97540][device-orientation-issue].
+
+### Pause and Resume video recording
+
+Pausing and resuming the video recording
+is not supported due to Windows API limitations.
+
+### Exposure mode, point and offset
+
+Support for exposure mode and offset
+is not yet implemented: [issue #97537][camera-control-issue].
+
+Exposure points are not supported due to
+limitations of the Windows API.
+
+### Focus mode and point
+
+Support for focus mode and point
+is not yet implemented: [issue #97537][camera-control-issue].
+
+Focus points are not supported due to
+current limitations of the Windows API.
+
+### Flash mode
+
+Support for flash mode is not yet implemented: [issue #97537][camera-control-issue].
+
+### Streaming of frames
+
+Support for image streaming is not yet implemented: [issue #97542][image-streams-issue].
+
+## Error handling
+
+Camera errors can be listened to via the platform's `onCameraError` method.
+
+Listening to errors is important, and in certain situations,
+disposing of the camera is the only way to recover.
+
+<!-- Links -->
+
+[camera]: https://pub.dev/packages/camera
+[endorsed-federated-plugin]: https://flutter.dev/docs/development/packages-and-plugins/developing-packages#endorsed-federated-plugin
+[install]: https://pub.dev/packages/camera_windows/install
+[camera-control-issue]: https://github.com/flutter/flutter/issues/97537
+[device-orientation-issue]: https://github.com/flutter/flutter/issues/97540
+[image-streams-issue]: https://github.com/flutter/flutter/issues/97542
\ No newline at end of file
diff --git a/packages/camera/camera_windows/example/.gitignore b/packages/camera/camera_windows/example/.gitignore
new file mode 100644
index 0000000..0fa6b67
--- /dev/null
+++ b/packages/camera/camera_windows/example/.gitignore
@@ -0,0 +1,46 @@
+# Miscellaneous
+*.class
+*.log
+*.pyc
+*.swp
+.DS_Store
+.atom/
+.buildlog/
+.history
+.svn/
+
+# IntelliJ related
+*.iml
+*.ipr
+*.iws
+.idea/
+
+# The .vscode folder contains launch configuration and tasks you configure in
+# VS Code which you may wish to be included in version control, so this line
+# is commented out by default.
+#.vscode/
+
+# Flutter/Dart/Pub related
+**/doc/api/
+**/ios/Flutter/.last_build_id
+.dart_tool/
+.flutter-plugins
+.flutter-plugins-dependencies
+.packages
+.pub-cache/
+.pub/
+/build/
+
+# Web related
+lib/generated_plugin_registrant.dart
+
+# Symbolication related
+app.*.symbols
+
+# Obfuscation related
+app.*.map.json
+
+# Android Studio will place build artifacts here
+/android/app/debug
+/android/app/profile
+/android/app/release
diff --git a/packages/camera/camera_windows/example/.metadata b/packages/camera/camera_windows/example/.metadata
new file mode 100644
index 0000000..a5584fc
--- /dev/null
+++ b/packages/camera/camera_windows/example/.metadata
@@ -0,0 +1,10 @@
+# This file tracks properties of this Flutter project.
+# Used by Flutter tool to assess capabilities and perform upgrades etc.
+#
+# This file should be version controlled and should not be manually edited.
+
+version:
+  revision: 18116933e77adc82f80866c928266a5b4f1ed645
+  channel: stable
+
+project_type: app
diff --git a/packages/camera/camera_windows/example/README.md b/packages/camera/camera_windows/example/README.md
new file mode 100644
index 0000000..ee73264
--- /dev/null
+++ b/packages/camera/camera_windows/example/README.md
@@ -0,0 +1,3 @@
+# camera_windows_example
+
+Demonstrates how to use the camera_windows plugin.
diff --git a/packages/camera/camera_windows/example/integration_test/camera_test.dart b/packages/camera/camera_windows/example/integration_test/camera_test.dart
new file mode 100644
index 0000000..cda0f40
--- /dev/null
+++ b/packages/camera/camera_windows/example/integration_test/camera_test.dart
@@ -0,0 +1,100 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+import 'dart:async';
+import 'package:async/async.dart';
+import 'package:camera_platform_interface/camera_platform_interface.dart';
+import 'package:flutter/services.dart';
+import 'package:flutter_test/flutter_test.dart';
+import 'package:integration_test/integration_test.dart';
+
+// Note that these integration tests do not currently cover
+// most features and code paths, as those can only be tested when
+// one or more cameras are available in the test environment.
+// Native unit tests with better coverage are available in
+// the native part of the plugin implementation.
+
+void main() {
+  IntegrationTestWidgetsFlutterBinding.ensureInitialized();
+
+  group('initializeCamera', () {
+    testWidgets('throws exception if camera is not created',
+        (WidgetTester _) async {
+      final CameraPlatform camera = CameraPlatform.instance;
+
+      expect(() async => await camera.initializeCamera(1234),
+          throwsA(isA<CameraException>()));
+    });
+  });
+
+  group('takePicture', () {
+    testWidgets('throws exception if camera is not created',
+        (WidgetTester _) async {
+      final CameraPlatform camera = CameraPlatform.instance;
+
+      expect(() async => await camera.takePicture(1234),
+          throwsA(isA<PlatformException>()));
+    });
+  });
+
+  group('startVideoRecording', () {
+    testWidgets('throws exception if camera is not created',
+        (WidgetTester _) async {
+      final CameraPlatform camera = CameraPlatform.instance;
+
+      expect(() async => await camera.startVideoRecording(1234),
+          throwsA(isA<PlatformException>()));
+    });
+  });
+
+  group('stopVideoRecording', () {
+    testWidgets('throws exception if camera is not created',
+        (WidgetTester _) async {
+      final CameraPlatform camera = CameraPlatform.instance;
+
+      expect(() async => await camera.stopVideoRecording(1234),
+          throwsA(isA<PlatformException>()));
+    });
+  });
+
+  group('pausePreview', () {
+    testWidgets('throws exception if camera is not created',
+        (WidgetTester _) async {
+      final CameraPlatform camera = CameraPlatform.instance;
+
+      expect(() async => await camera.pausePreview(1234),
+          throwsA(isA<PlatformException>()));
+    });
+  });
+
+  group('resumePreview', () {
+    testWidgets('throws exception if camera is not created',
+        (WidgetTester _) async {
+      final CameraPlatform camera = CameraPlatform.instance;
+
+      expect(() async => await camera.resumePreview(1234),
+          throwsA(isA<PlatformException>()));
+    });
+  });
+
+  group('onDeviceOrientationChanged', () {
+    testWidgets('emits the initial DeviceOrientationChangedEvent',
+        (WidgetTester _) async {
+      final Stream<DeviceOrientationChangedEvent> eventStream =
+          CameraPlatform.instance.onDeviceOrientationChanged();
+
+      final StreamQueue<DeviceOrientationChangedEvent> streamQueue =
+          StreamQueue<DeviceOrientationChangedEvent>(eventStream);
+
+      expect(
+        await streamQueue.next,
+        equals(
+          const DeviceOrientationChangedEvent(
+            DeviceOrientation.landscapeRight,
+          ),
+        ),
+      );
+    });
+  });
+}
diff --git a/packages/camera/camera_windows/example/lib/main.dart b/packages/camera/camera_windows/example/lib/main.dart
new file mode 100644
index 0000000..b73e00c
--- /dev/null
+++ b/packages/camera/camera_windows/example/lib/main.dart
@@ -0,0 +1,452 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+import 'dart:async';
+
+import 'package:camera_platform_interface/camera_platform_interface.dart';
+import 'package:flutter/material.dart';
+import 'package:flutter/services.dart';
+
+void main() {
+  runApp(MyApp());
+}
+
+/// Example app for Camera Windows plugin.
+class MyApp extends StatefulWidget {
+  @override
+  State<MyApp> createState() => _MyAppState();
+}
+
+class _MyAppState extends State<MyApp> {
+  String _cameraInfo = 'Unknown';
+  List<CameraDescription> _cameras = <CameraDescription>[];
+  int _cameraIndex = 0;
+  int _cameraId = -1;
+  bool _initialized = false;
+  bool _recording = false;
+  bool _recordingTimed = false;
+  bool _recordAudio = true;
+  bool _previewPaused = false;
+  Size? _previewSize;
+  ResolutionPreset _resolutionPreset = ResolutionPreset.veryHigh;
+  StreamSubscription<CameraErrorEvent>? _errorStreamSubscription;
+  StreamSubscription<CameraClosingEvent>? _cameraClosingStreamSubscription;
+
+  @override
+  void initState() {
+    super.initState();
+    WidgetsFlutterBinding.ensureInitialized();
+    _fetchCameras();
+  }
+
+  @override
+  void dispose() {
+    _disposeCurrentCamera();
+    _errorStreamSubscription?.cancel();
+    _errorStreamSubscription = null;
+    _cameraClosingStreamSubscription?.cancel();
+    _cameraClosingStreamSubscription = null;
+    super.dispose();
+  }
+
+  /// Fetches list of available cameras from camera_windows plugin.
+  Future<void> _fetchCameras() async {
+    String cameraInfo;
+    List<CameraDescription> cameras = <CameraDescription>[];
+
+    int cameraIndex = 0;
+    try {
+      cameras = await CameraPlatform.instance.availableCameras();
+      if (cameras.isEmpty) {
+        cameraInfo = 'No available cameras';
+      } else {
+        cameraIndex = _cameraIndex % cameras.length;
+        cameraInfo = 'Found camera: ${cameras[cameraIndex].name}';
+      }
+    } on PlatformException catch (e) {
+      cameraInfo = 'Failed to get cameras: ${e.code}: ${e.message}';
+    }
+
+    if (mounted) {
+      setState(() {
+        _cameraIndex = cameraIndex;
+        _cameras = cameras;
+        _cameraInfo = cameraInfo;
+      });
+    }
+  }
+
+  /// Initializes the camera on the device.
+  Future<void> _initializeCamera() async {
+    assert(!_initialized);
+
+    if (_cameras.isEmpty) {
+      return;
+    }
+
+    int cameraId = -1;
+    try {
+      final int cameraIndex = _cameraIndex % _cameras.length;
+      final CameraDescription camera = _cameras[cameraIndex];
+
+      cameraId = await CameraPlatform.instance.createCamera(
+        camera,
+        _resolutionPreset,
+        enableAudio: _recordAudio,
+      );
+
+      _errorStreamSubscription?.cancel();
+      _errorStreamSubscription = CameraPlatform.instance
+          .onCameraError(cameraId)
+          .listen(_onCameraError);
+
+      _cameraClosingStreamSubscription?.cancel();
+      _cameraClosingStreamSubscription = CameraPlatform.instance
+          .onCameraClosing(cameraId)
+          .listen(_onCameraClosing);
+
+      final Future<CameraInitializedEvent> initialized =
+          CameraPlatform.instance.onCameraInitialized(cameraId).first;
+
+      await CameraPlatform.instance.initializeCamera(
+        cameraId,
+        imageFormatGroup: ImageFormatGroup.unknown,
+      );
+
+      final CameraInitializedEvent event = await initialized;
+      _previewSize = Size(
+        event.previewWidth,
+        event.previewHeight,
+      );
+
+      if (mounted) {
+        setState(() {
+          _initialized = true;
+          _cameraId = cameraId;
+          _cameraIndex = cameraIndex;
+          _cameraInfo = 'Capturing camera: ${camera.name}';
+        });
+      }
+    } on CameraException catch (e) {
+      try {
+        if (cameraId >= 0) {
+          await CameraPlatform.instance.dispose(cameraId);
+        }
+      } on CameraException catch (e) {
+        debugPrint('Failed to dispose camera: ${e.code}: ${e.description}');
+      }
+
+      // Reset state.
+      if (mounted) {
+        setState(() {
+          _initialized = false;
+          _cameraId = -1;
+          _cameraIndex = 0;
+          _previewSize = null;
+          _recording = false;
+          _recordingTimed = false;
+          _cameraInfo =
+              'Failed to initialize camera: ${e.code}: ${e.description}';
+        });
+      }
+    }
+  }
+
+  Future<void> _disposeCurrentCamera() async {
+    if (_cameraId >= 0 && _initialized) {
+      try {
+        await CameraPlatform.instance.dispose(_cameraId);
+
+        if (mounted) {
+          setState(() {
+            _initialized = false;
+            _cameraId = -1;
+            _previewSize = null;
+            _recording = false;
+            _recordingTimed = false;
+            _previewPaused = false;
+            _cameraInfo = 'Camera disposed';
+          });
+        }
+      } on CameraException catch (e) {
+        if (mounted) {
+          setState(() {
+            _cameraInfo =
+                'Failed to dispose camera: ${e.code}: ${e.description}';
+          });
+        }
+      }
+    }
+  }
+
+  Widget _buildPreview() {
+    return CameraPlatform.instance.buildPreview(_cameraId);
+  }
+
+  Future<void> _takePicture() async {
+    final XFile _file = await CameraPlatform.instance.takePicture(_cameraId);
+    _showInSnackBar('Picture captured to: ${_file.path}');
+  }
+
+  Future<void> _recordTimed(int seconds) async {
+    if (_initialized && _cameraId > 0 && !_recordingTimed) {
+      CameraPlatform.instance
+          .onVideoRecordedEvent(_cameraId)
+          .first
+          .then((VideoRecordedEvent event) async {
+        if (mounted) {
+          setState(() {
+            _recordingTimed = false;
+          });
+
+          _showInSnackBar('Video captured to: ${event.file.path}');
+        }
+      });
+
+      await CameraPlatform.instance.startVideoRecording(
+        _cameraId,
+        maxVideoDuration: Duration(seconds: seconds),
+      );
+
+      if (mounted) {
+        setState(() {
+          _recordingTimed = true;
+        });
+      }
+    }
+  }
+
+  Future<void> _toggleRecord() async {
+    if (_initialized && _cameraId > 0) {
+      if (_recordingTimed) {
+        // Stop the timed recording early.
+        await CameraPlatform.instance.stopVideoRecording(_cameraId);
+      } else {
+        if (!_recording) {
+          await CameraPlatform.instance.startVideoRecording(_cameraId);
+        } else {
+          final XFile _file =
+              await CameraPlatform.instance.stopVideoRecording(_cameraId);
+
+          _showInSnackBar('Video captured to: ${_file.path}');
+        }
+
+        if (mounted) {
+          setState(() {
+            _recording = !_recording;
+          });
+        }
+      }
+    }
+  }
+
+  Future<void> _togglePreview() async {
+    if (_initialized && _cameraId >= 0) {
+      if (!_previewPaused) {
+        await CameraPlatform.instance.pausePreview(_cameraId);
+      } else {
+        await CameraPlatform.instance.resumePreview(_cameraId);
+      }
+      if (mounted) {
+        setState(() {
+          _previewPaused = !_previewPaused;
+        });
+      }
+    }
+  }
+
+  Future<void> _switchCamera() async {
+    if (_cameras.isNotEmpty) {
+      // Select the next camera index.
+      _cameraIndex = (_cameraIndex + 1) % _cameras.length;
+      if (_initialized && _cameraId >= 0) {
+        await _disposeCurrentCamera();
+        await _fetchCameras();
+        if (_cameras.isNotEmpty) {
+          await _initializeCamera();
+        }
+      } else {
+        await _fetchCameras();
+      }
+    }
+  }
+
+  Future<void> _onResolutionChange(ResolutionPreset newValue) async {
+    setState(() {
+      _resolutionPreset = newValue;
+    });
+    if (_initialized && _cameraId >= 0) {
+      // Re-inits camera with new resolution preset.
+      await _disposeCurrentCamera();
+      await _initializeCamera();
+    }
+  }
+
+  Future<void> _onAudioChange(bool recordAudio) async {
+    setState(() {
+      _recordAudio = recordAudio;
+    });
+    if (_initialized && _cameraId >= 0) {
+      // Re-inits camera with new record audio setting.
+      await _disposeCurrentCamera();
+      await _initializeCamera();
+    }
+  }
+
+  void _onCameraError(CameraErrorEvent event) {
+    if (mounted) {
+      _scaffoldMessengerKey.currentState?.showSnackBar(
+          SnackBar(content: Text('Error: ${event.description}')));
+
+      // Dispose the camera on error, as it cannot be used anymore.
+      _disposeCurrentCamera();
+      _fetchCameras();
+    }
+  }
+
+  void _onCameraClosing(CameraClosingEvent event) {
+    if (mounted) {
+      _showInSnackBar('Camera is closing');
+    }
+  }
+
+  void _showInSnackBar(String message) {
+    _scaffoldMessengerKey.currentState?.showSnackBar(SnackBar(
+      content: Text(message),
+      duration: const Duration(seconds: 1),
+    ));
+  }
+
+  final GlobalKey<ScaffoldMessengerState> _scaffoldMessengerKey =
+      GlobalKey<ScaffoldMessengerState>();
+
+  @override
+  Widget build(BuildContext context) {
+    final List<DropdownMenuItem<ResolutionPreset>> resolutionItems =
+        ResolutionPreset.values
+            .map<DropdownMenuItem<ResolutionPreset>>((ResolutionPreset value) {
+      return DropdownMenuItem<ResolutionPreset>(
+        value: value,
+        child: Text(value.toString()),
+      );
+    }).toList();
+
+    return MaterialApp(
+      scaffoldMessengerKey: _scaffoldMessengerKey,
+      home: Scaffold(
+        appBar: AppBar(
+          title: const Text('Plugin example app'),
+        ),
+        body: ListView(
+          children: <Widget>[
+            Padding(
+              padding: const EdgeInsets.symmetric(
+                vertical: 5,
+                horizontal: 10,
+              ),
+              child: Text(_cameraInfo),
+            ),
+            if (_cameras.isEmpty)
+              ElevatedButton(
+                onPressed: _fetchCameras,
+                child: const Text('Re-check available cameras'),
+              ),
+            if (_cameras.isNotEmpty)
+              Row(
+                mainAxisAlignment: MainAxisAlignment.center,
+                children: <Widget>[
+                  DropdownButton<ResolutionPreset>(
+                    value: _resolutionPreset,
+                    onChanged: (ResolutionPreset? value) {
+                      if (value != null) {
+                        _onResolutionChange(value);
+                      }
+                    },
+                    items: resolutionItems,
+                  ),
+                  const SizedBox(width: 20),
+                  const Text('Audio:'),
+                  Switch(
+                      value: _recordAudio,
+                      onChanged: (bool state) => _onAudioChange(state)),
+                  const SizedBox(width: 20),
+                  ElevatedButton(
+                    onPressed: _initialized
+                        ? _disposeCurrentCamera
+                        : _initializeCamera,
+                    child:
+                        Text(_initialized ? 'Dispose camera' : 'Create camera'),
+                  ),
+                  const SizedBox(width: 5),
+                  ElevatedButton(
+                    onPressed: _initialized ? _takePicture : null,
+                    child: const Text('Take picture'),
+                  ),
+                  const SizedBox(width: 5),
+                  ElevatedButton(
+                    onPressed: _initialized ? _togglePreview : null,
+                    child: Text(
+                      _previewPaused ? 'Resume preview' : 'Pause preview',
+                    ),
+                  ),
+                  const SizedBox(width: 5),
+                  ElevatedButton(
+                    onPressed: _initialized ? _toggleRecord : null,
+                    child: Text(
+                      (_recording || _recordingTimed)
+                          ? 'Stop recording'
+                          : 'Record Video',
+                    ),
+                  ),
+                  const SizedBox(width: 5),
+                  ElevatedButton(
+                    onPressed: (_initialized && !_recording && !_recordingTimed)
+                        ? () => _recordTimed(5)
+                        : null,
+                    child: const Text(
+                      'Record 5 seconds',
+                    ),
+                  ),
+                  if (_cameras.length > 1) ...<Widget>[
+                    const SizedBox(width: 5),
+                    ElevatedButton(
+                      onPressed: _switchCamera,
+                      child: const Text(
+                        'Switch camera',
+                      ),
+                    ),
+                  ]
+                ],
+              ),
+            const SizedBox(height: 5),
+            if (_initialized && _cameraId > 0 && _previewSize != null)
+              Padding(
+                padding: const EdgeInsets.symmetric(
+                  vertical: 10,
+                ),
+                child: Align(
+                  alignment: Alignment.center,
+                  child: Container(
+                    constraints: const BoxConstraints(
+                      maxHeight: 500,
+                    ),
+                    child: AspectRatio(
+                      aspectRatio: _previewSize!.width / _previewSize!.height,
+                      child: _buildPreview(),
+                    ),
+                  ),
+                ),
+              ),
+            if (_previewSize != null)
+              Center(
+                child: Text(
+                  'Preview size: ${_previewSize!.width.toStringAsFixed(0)}x${_previewSize!.height.toStringAsFixed(0)}',
+                ),
+              ),
+          ],
+        ),
+      ),
+    );
+  }
+}
diff --git a/packages/camera/camera_windows/example/pubspec.yaml b/packages/camera/camera_windows/example/pubspec.yaml
new file mode 100644
index 0000000..aa806a2
--- /dev/null
+++ b/packages/camera/camera_windows/example/pubspec.yaml
@@ -0,0 +1,28 @@
+name: camera_windows_example
+description: Demonstrates how to use the camera_windows plugin.
+publish_to: 'none'
+
+environment:
+  sdk: ">=2.12.0 <3.0.0"
+  flutter: ">=2.8.0"
+
+dependencies:
+  camera_platform_interface: ^2.1.2
+  camera_windows:
+    # When depending on this package from a real application you should use:
+    #   camera_windows: ^x.y.z
+    # See https://dart.dev/tools/pub/dependencies#version-constraints
+    # The example app is bundled with the plugin so we use a path dependency on
+    # the parent directory to use the current plugin's version.
+    path: ../
+  flutter:
+    sdk: flutter
+
+dev_dependencies:
+  flutter_test:
+    sdk: flutter
+  integration_test:
+    sdk: flutter
+
+flutter:
+  uses-material-design: true
diff --git a/packages/camera/camera_windows/example/test_driver/integration_test.dart b/packages/camera/camera_windows/example/test_driver/integration_test.dart
new file mode 100644
index 0000000..4f10f2a
--- /dev/null
+++ b/packages/camera/camera_windows/example/test_driver/integration_test.dart
@@ -0,0 +1,7 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+import 'package:integration_test/integration_test_driver.dart';
+
+Future<void> main() => integrationDriver();
diff --git a/packages/camera/camera_windows/example/windows/.gitignore b/packages/camera/camera_windows/example/windows/.gitignore
new file mode 100644
index 0000000..d492d0d
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/.gitignore
@@ -0,0 +1,17 @@
+flutter/ephemeral/
+
+# Visual Studio user-specific files.
+*.suo
+*.user
+*.userosscache
+*.sln.docstates
+
+# Visual Studio build-related files.
+x64/
+x86/
+
+# Visual Studio cache files
+# files ending in .cache can be ignored
+*.[Cc]ache
+# but keep track of directories ending in .cache
+!*.[Cc]ache/
diff --git a/packages/camera/camera_windows/example/windows/CMakeLists.txt b/packages/camera/camera_windows/example/windows/CMakeLists.txt
new file mode 100644
index 0000000..28757c7
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/CMakeLists.txt
@@ -0,0 +1,100 @@
+cmake_minimum_required(VERSION 3.14)
+project(camera_windows_example LANGUAGES CXX)
+
+set(BINARY_NAME "camera_windows_example")
+
+cmake_policy(SET CMP0063 NEW)
+
+set(CMAKE_INSTALL_RPATH "$ORIGIN/lib")
+
+# Configure build options.
+get_property(IS_MULTICONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
+if(IS_MULTICONFIG)
+  set(CMAKE_CONFIGURATION_TYPES "Debug;Profile;Release"
+    CACHE STRING "" FORCE)
+else()
+  if(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
+    set(CMAKE_BUILD_TYPE "Debug" CACHE
+      STRING "Flutter build mode" FORCE)
+    set_property(CACHE CMAKE_BUILD_TYPE PROPERTY STRINGS
+      "Debug" "Profile" "Release")
+  endif()
+endif()
+
+set(CMAKE_EXE_LINKER_FLAGS_PROFILE "${CMAKE_EXE_LINKER_FLAGS_RELEASE}")
+set(CMAKE_SHARED_LINKER_FLAGS_PROFILE "${CMAKE_SHARED_LINKER_FLAGS_RELEASE}")
+set(CMAKE_C_FLAGS_PROFILE "${CMAKE_C_FLAGS_RELEASE}")
+set(CMAKE_CXX_FLAGS_PROFILE "${CMAKE_CXX_FLAGS_RELEASE}")
+
+# Use Unicode for all projects.
+add_definitions(-DUNICODE -D_UNICODE)
+
+# Compilation settings that should be applied to most targets.
+function(APPLY_STANDARD_SETTINGS TARGET)
+  target_compile_features(${TARGET} PUBLIC cxx_std_17)
+  target_compile_options(${TARGET} PRIVATE /W4 /WX /wd"4100")
+  target_compile_options(${TARGET} PRIVATE /EHsc)
+  target_compile_definitions(${TARGET} PRIVATE "_HAS_EXCEPTIONS=0")
+  target_compile_definitions(${TARGET} PRIVATE "$<$<CONFIG:Debug>:_DEBUG>")
+endfunction()
+
+set(FLUTTER_MANAGED_DIR "${CMAKE_CURRENT_SOURCE_DIR}/flutter")
+
+# Flutter library and tool build rules.
+add_subdirectory(${FLUTTER_MANAGED_DIR})
+
+# Application build
+add_subdirectory("runner")
+
+# Enable the test target.
+set(include_camera_windows_tests TRUE)
+# Provide an alias for the test target using the name expected by repo tooling.
+add_custom_target(unit_tests DEPENDS camera_windows_test)
+
+# Generated plugin build rules, which manage building the plugins and adding
+# them to the application.
+include(flutter/generated_plugins.cmake)
+
+
+# === Installation ===
+# Support files are copied into place next to the executable, so that it can
+# run in place. This is done instead of making a separate bundle (as on Linux)
+# so that building and running from within Visual Studio will work.
+set(BUILD_BUNDLE_DIR "$<TARGET_FILE_DIR:${BINARY_NAME}>")
+# Make the "install" step default, as it's required to run.
+set(CMAKE_VS_INCLUDE_INSTALL_TO_DEFAULT_BUILD 1)
+if(CMAKE_INSTALL_PREFIX_INITIALIZED_TO_DEFAULT)
+  set(CMAKE_INSTALL_PREFIX "${BUILD_BUNDLE_DIR}" CACHE PATH "..." FORCE)
+endif()
+
+set(INSTALL_BUNDLE_DATA_DIR "${CMAKE_INSTALL_PREFIX}/data")
+set(INSTALL_BUNDLE_LIB_DIR "${CMAKE_INSTALL_PREFIX}")
+
+install(TARGETS ${BINARY_NAME} RUNTIME DESTINATION "${CMAKE_INSTALL_PREFIX}"
+  COMPONENT Runtime)
+
+install(FILES "${FLUTTER_ICU_DATA_FILE}" DESTINATION "${INSTALL_BUNDLE_DATA_DIR}"
+  COMPONENT Runtime)
+
+install(FILES "${FLUTTER_LIBRARY}" DESTINATION "${INSTALL_BUNDLE_LIB_DIR}"
+  COMPONENT Runtime)
+
+if(PLUGIN_BUNDLED_LIBRARIES)
+  install(FILES "${PLUGIN_BUNDLED_LIBRARIES}"
+    DESTINATION "${INSTALL_BUNDLE_LIB_DIR}"
+    COMPONENT Runtime)
+endif()
+
+# Fully re-copy the assets directory on each build to avoid having stale files
+# from a previous install.
+set(FLUTTER_ASSET_DIR_NAME "flutter_assets")
+install(CODE "
+  file(REMOVE_RECURSE \"${INSTALL_BUNDLE_DATA_DIR}/${FLUTTER_ASSET_DIR_NAME}\")
+  " COMPONENT Runtime)
+install(DIRECTORY "${PROJECT_BUILD_DIR}/${FLUTTER_ASSET_DIR_NAME}"
+  DESTINATION "${INSTALL_BUNDLE_DATA_DIR}" COMPONENT Runtime)
+
+# Install the AOT library on non-Debug builds only.
+install(FILES "${AOT_LIBRARY}" DESTINATION "${INSTALL_BUNDLE_DATA_DIR}"
+  CONFIGURATIONS Profile;Release
+  COMPONENT Runtime)
diff --git a/packages/camera/camera_windows/example/windows/flutter/CMakeLists.txt b/packages/camera/camera_windows/example/windows/flutter/CMakeLists.txt
new file mode 100644
index 0000000..b2e4bd8
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/flutter/CMakeLists.txt
@@ -0,0 +1,103 @@
+cmake_minimum_required(VERSION 3.14)
+
+set(EPHEMERAL_DIR "${CMAKE_CURRENT_SOURCE_DIR}/ephemeral")
+
+# Configuration provided via flutter tool.
+include(${EPHEMERAL_DIR}/generated_config.cmake)
+
+# TODO: Move the rest of this into files in ephemeral. See
+# https://github.com/flutter/flutter/issues/57146.
+set(WRAPPER_ROOT "${EPHEMERAL_DIR}/cpp_client_wrapper")
+
+# === Flutter Library ===
+set(FLUTTER_LIBRARY "${EPHEMERAL_DIR}/flutter_windows.dll")
+
+# Published to parent scope for install step.
+set(FLUTTER_LIBRARY ${FLUTTER_LIBRARY} PARENT_SCOPE)
+set(FLUTTER_ICU_DATA_FILE "${EPHEMERAL_DIR}/icudtl.dat" PARENT_SCOPE)
+set(PROJECT_BUILD_DIR "${PROJECT_DIR}/build/" PARENT_SCOPE)
+set(AOT_LIBRARY "${PROJECT_DIR}/build/windows/app.so" PARENT_SCOPE)
+
+list(APPEND FLUTTER_LIBRARY_HEADERS
+  "flutter_export.h"
+  "flutter_windows.h"
+  "flutter_messenger.h"
+  "flutter_plugin_registrar.h"
+  "flutter_texture_registrar.h"
+)
+list(TRANSFORM FLUTTER_LIBRARY_HEADERS PREPEND "${EPHEMERAL_DIR}/")
+add_library(flutter INTERFACE)
+target_include_directories(flutter INTERFACE
+  "${EPHEMERAL_DIR}"
+)
+target_link_libraries(flutter INTERFACE "${FLUTTER_LIBRARY}.lib")
+add_dependencies(flutter flutter_assemble)
+
+# === Wrapper ===
+list(APPEND CPP_WRAPPER_SOURCES_CORE
+  "core_implementations.cc"
+  "standard_codec.cc"
+)
+list(TRANSFORM CPP_WRAPPER_SOURCES_CORE PREPEND "${WRAPPER_ROOT}/")
+list(APPEND CPP_WRAPPER_SOURCES_PLUGIN
+  "plugin_registrar.cc"
+)
+list(TRANSFORM CPP_WRAPPER_SOURCES_PLUGIN PREPEND "${WRAPPER_ROOT}/")
+list(APPEND CPP_WRAPPER_SOURCES_APP
+  "flutter_engine.cc"
+  "flutter_view_controller.cc"
+)
+list(TRANSFORM CPP_WRAPPER_SOURCES_APP PREPEND "${WRAPPER_ROOT}/")
+
+# Wrapper sources needed for a plugin.
+add_library(flutter_wrapper_plugin STATIC
+  ${CPP_WRAPPER_SOURCES_CORE}
+  ${CPP_WRAPPER_SOURCES_PLUGIN}
+)
+apply_standard_settings(flutter_wrapper_plugin)
+set_target_properties(flutter_wrapper_plugin PROPERTIES
+  POSITION_INDEPENDENT_CODE ON)
+set_target_properties(flutter_wrapper_plugin PROPERTIES
+  CXX_VISIBILITY_PRESET hidden)
+target_link_libraries(flutter_wrapper_plugin PUBLIC flutter)
+target_include_directories(flutter_wrapper_plugin PUBLIC
+  "${WRAPPER_ROOT}/include"
+)
+add_dependencies(flutter_wrapper_plugin flutter_assemble)
+
+# Wrapper sources needed for the runner.
+add_library(flutter_wrapper_app STATIC
+  ${CPP_WRAPPER_SOURCES_CORE}
+  ${CPP_WRAPPER_SOURCES_APP}
+)
+apply_standard_settings(flutter_wrapper_app)
+target_link_libraries(flutter_wrapper_app PUBLIC flutter)
+target_include_directories(flutter_wrapper_app PUBLIC
+  "${WRAPPER_ROOT}/include"
+)
+add_dependencies(flutter_wrapper_app flutter_assemble)
+
+# === Flutter tool backend ===
+# _phony_ is a non-existent file to force this command to run every time,
+# since currently there's no way to get a full input/output list from the
+# flutter tool.
+set(PHONY_OUTPUT "${CMAKE_CURRENT_BINARY_DIR}/_phony_")
+set_source_files_properties("${PHONY_OUTPUT}" PROPERTIES SYMBOLIC TRUE)
+add_custom_command(
+  OUTPUT ${FLUTTER_LIBRARY} ${FLUTTER_LIBRARY_HEADERS}
+    ${CPP_WRAPPER_SOURCES_CORE} ${CPP_WRAPPER_SOURCES_PLUGIN}
+    ${CPP_WRAPPER_SOURCES_APP}
+    ${PHONY_OUTPUT}
+  COMMAND ${CMAKE_COMMAND} -E env
+    ${FLUTTER_TOOL_ENVIRONMENT}
+    "${FLUTTER_ROOT}/packages/flutter_tools/bin/tool_backend.bat"
+      windows-x64 $<CONFIG>
+  VERBATIM
+)
+add_custom_target(flutter_assemble DEPENDS
+  "${FLUTTER_LIBRARY}"
+  ${FLUTTER_LIBRARY_HEADERS}
+  ${CPP_WRAPPER_SOURCES_CORE}
+  ${CPP_WRAPPER_SOURCES_PLUGIN}
+  ${CPP_WRAPPER_SOURCES_APP}
+)
diff --git a/packages/camera/camera_windows/example/windows/flutter/generated_plugins.cmake b/packages/camera/camera_windows/example/windows/flutter/generated_plugins.cmake
new file mode 100644
index 0000000..458d22d
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/flutter/generated_plugins.cmake
@@ -0,0 +1,24 @@
+#
+# Generated file, do not edit.
+#
+
+list(APPEND FLUTTER_PLUGIN_LIST
+  camera_windows
+)
+
+list(APPEND FLUTTER_FFI_PLUGIN_LIST
+)
+
+set(PLUGIN_BUNDLED_LIBRARIES)
+
+foreach(plugin ${FLUTTER_PLUGIN_LIST})
+  add_subdirectory(flutter/ephemeral/.plugin_symlinks/${plugin}/windows plugins/${plugin})
+  target_link_libraries(${BINARY_NAME} PRIVATE ${plugin}_plugin)
+  list(APPEND PLUGIN_BUNDLED_LIBRARIES $<TARGET_FILE:${plugin}_plugin>)
+  list(APPEND PLUGIN_BUNDLED_LIBRARIES ${${plugin}_bundled_libraries})
+endforeach(plugin)
+
+foreach(ffi_plugin ${FLUTTER_FFI_PLUGIN_LIST})
+  add_subdirectory(flutter/ephemeral/.plugin_symlinks/${ffi_plugin}/windows plugins/${ffi_plugin})
+  list(APPEND PLUGIN_BUNDLED_LIBRARIES ${${ffi_plugin}_bundled_libraries})
+endforeach(ffi_plugin)
diff --git a/packages/camera/camera_windows/example/windows/runner/CMakeLists.txt b/packages/camera/camera_windows/example/windows/runner/CMakeLists.txt
new file mode 100644
index 0000000..adb2052
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/CMakeLists.txt
@@ -0,0 +1,18 @@
+cmake_minimum_required(VERSION 3.14)
+project(runner LANGUAGES CXX)
+
+add_executable(${BINARY_NAME} WIN32
+  "flutter_window.cpp"
+  "main.cpp"
+  "utils.cpp"
+  "win32_window.cpp"
+  "${FLUTTER_MANAGED_DIR}/generated_plugin_registrant.cc"
+  "Runner.rc"
+  "runner.exe.manifest"
+)
+
+apply_standard_settings(${BINARY_NAME})
+target_compile_definitions(${BINARY_NAME} PRIVATE "NOMINMAX")
+target_link_libraries(${BINARY_NAME} PRIVATE flutter flutter_wrapper_app)
+target_include_directories(${BINARY_NAME} PRIVATE "${CMAKE_SOURCE_DIR}")
+add_dependencies(${BINARY_NAME} flutter_assemble)
diff --git a/packages/camera/camera_windows/example/windows/runner/Runner.rc b/packages/camera/camera_windows/example/windows/runner/Runner.rc
new file mode 100644
index 0000000..f1cfa43
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/Runner.rc
@@ -0,0 +1,121 @@
+// Microsoft Visual C++ generated resource script.
+//
+#pragma code_page(65001)
+#include "resource.h"
+
+#define APSTUDIO_READONLY_SYMBOLS
+/////////////////////////////////////////////////////////////////////////////
+//
+// Generated from the TEXTINCLUDE 2 resource.
+//
+#include "winres.h"
+
+/////////////////////////////////////////////////////////////////////////////
+#undef APSTUDIO_READONLY_SYMBOLS
+
+/////////////////////////////////////////////////////////////////////////////
+// English (United States) resources
+
+#if !defined(AFX_RESOURCE_DLL) || defined(AFX_TARG_ENU)
+LANGUAGE LANG_ENGLISH, SUBLANG_ENGLISH_US
+
+#ifdef APSTUDIO_INVOKED
+/////////////////////////////////////////////////////////////////////////////
+//
+// TEXTINCLUDE
+//
+
+1 TEXTINCLUDE
+BEGIN
+    "resource.h\0"
+END
+
+2 TEXTINCLUDE
+BEGIN
+    "#include ""winres.h""\r\n"
+    "\0"
+END
+
+3 TEXTINCLUDE
+BEGIN
+    "\r\n"
+    "\0"
+END
+
+#endif    // APSTUDIO_INVOKED
+
+
+/////////////////////////////////////////////////////////////////////////////
+//
+// Icon
+//
+
+// Icon with lowest ID value placed first to ensure application icon
+// remains consistent on all systems.
+IDI_APP_ICON            ICON                    "resources\\app_icon.ico"
+
+
+/////////////////////////////////////////////////////////////////////////////
+//
+// Version
+//
+
+#ifdef FLUTTER_BUILD_NUMBER
+#define VERSION_AS_NUMBER FLUTTER_BUILD_NUMBER
+#else
+#define VERSION_AS_NUMBER 1,0,0
+#endif
+
+#ifdef FLUTTER_BUILD_NAME
+#define VERSION_AS_STRING #FLUTTER_BUILD_NAME
+#else
+#define VERSION_AS_STRING "1.0.0"
+#endif
+
+VS_VERSION_INFO VERSIONINFO
+ FILEVERSION VERSION_AS_NUMBER
+ PRODUCTVERSION VERSION_AS_NUMBER
+ FILEFLAGSMASK VS_FFI_FILEFLAGSMASK
+#ifdef _DEBUG
+ FILEFLAGS VS_FF_DEBUG
+#else
+ FILEFLAGS 0x0L
+#endif
+ FILEOS VOS__WINDOWS32
+ FILETYPE VFT_APP
+ FILESUBTYPE 0x0L
+BEGIN
+    BLOCK "StringFileInfo"
+    BEGIN
+        BLOCK "040904e4"
+        BEGIN
+            VALUE "CompanyName", "com.example" "\0"
+            VALUE "FileDescription", "Demonstrates how to use the camera_windows plugin." "\0"
+            VALUE "FileVersion", VERSION_AS_STRING "\0"
+            VALUE "InternalName", "camera_windows_example" "\0"
+            VALUE "LegalCopyright", "Copyright (C) 2021 com.example. All rights reserved." "\0"
+            VALUE "OriginalFilename", "camera_windows_example.exe" "\0"
+            VALUE "ProductName", "camera_windows_example" "\0"
+            VALUE "ProductVersion", VERSION_AS_STRING "\0"
+        END
+    END
+    BLOCK "VarFileInfo"
+    BEGIN
+        VALUE "Translation", 0x409, 1252
+    END
+END
+
+#endif    // English (United States) resources
+/////////////////////////////////////////////////////////////////////////////
+
+
+
+#ifndef APSTUDIO_INVOKED
+/////////////////////////////////////////////////////////////////////////////
+//
+// Generated from the TEXTINCLUDE 3 resource.
+//
+
+
+/////////////////////////////////////////////////////////////////////////////
+#endif    // not APSTUDIO_INVOKED
diff --git a/packages/camera/camera_windows/example/windows/runner/flutter_window.cpp b/packages/camera/camera_windows/example/windows/runner/flutter_window.cpp
new file mode 100644
index 0000000..8254bd9
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/flutter_window.cpp
@@ -0,0 +1,65 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "flutter_window.h"
+
+#include <optional>
+
+#include "flutter/generated_plugin_registrant.h"
+
+FlutterWindow::FlutterWindow(const flutter::DartProject& project)
+    : project_(project) {}
+
+FlutterWindow::~FlutterWindow() {}
+
+bool FlutterWindow::OnCreate() {
+  if (!Win32Window::OnCreate()) {
+    return false;
+  }
+
+  RECT frame = GetClientArea();
+
+  // The size here must match the window dimensions to avoid unnecessary surface
+  // creation / destruction in the startup path.
+  flutter_controller_ = std::make_unique<flutter::FlutterViewController>(
+      frame.right - frame.left, frame.bottom - frame.top, project_);
+  // Ensure that basic setup of the controller was successful.
+  if (!flutter_controller_->engine() || !flutter_controller_->view()) {
+    return false;
+  }
+  RegisterPlugins(flutter_controller_->engine());
+  SetChildContent(flutter_controller_->view()->GetNativeWindow());
+  return true;
+}
+
+void FlutterWindow::OnDestroy() {
+  if (flutter_controller_) {
+    flutter_controller_ = nullptr;
+  }
+
+  Win32Window::OnDestroy();
+}
+
+LRESULT
+FlutterWindow::MessageHandler(HWND hwnd, UINT const message,
+                              WPARAM const wparam,
+                              LPARAM const lparam) noexcept {
+  // Give Flutter, including plugins, an opportunity to handle window messages.
+  if (flutter_controller_) {
+    std::optional<LRESULT> result =
+        flutter_controller_->HandleTopLevelWindowProc(hwnd, message, wparam,
+                                                      lparam);
+    if (result) {
+      return *result;
+    }
+  }
+
+  switch (message) {
+    case WM_FONTCHANGE:
+      flutter_controller_->engine()->ReloadSystemFonts();
+      break;
+  }
+
+  return Win32Window::MessageHandler(hwnd, message, wparam, lparam);
+}
diff --git a/packages/camera/camera_windows/example/windows/runner/flutter_window.h b/packages/camera/camera_windows/example/windows/runner/flutter_window.h
new file mode 100644
index 0000000..f1fc669
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/flutter_window.h
@@ -0,0 +1,37 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef RUNNER_FLUTTER_WINDOW_H_
+#define RUNNER_FLUTTER_WINDOW_H_
+
+#include <flutter/dart_project.h>
+#include <flutter/flutter_view_controller.h>
+
+#include <memory>
+
+#include "win32_window.h"
+
+// A window that does nothing but host a Flutter view.
+class FlutterWindow : public Win32Window {
+ public:
+  // Creates a new FlutterWindow hosting a Flutter view running |project|.
+  explicit FlutterWindow(const flutter::DartProject& project);
+  virtual ~FlutterWindow();
+
+ protected:
+  // Win32Window:
+  bool OnCreate() override;
+  void OnDestroy() override;
+  LRESULT MessageHandler(HWND window, UINT const message, WPARAM const wparam,
+                         LPARAM const lparam) noexcept override;
+
+ private:
+  // The project to run.
+  flutter::DartProject project_;
+
+  // The Flutter instance hosted by this window.
+  std::unique_ptr<flutter::FlutterViewController> flutter_controller_;
+};
+
+#endif  // RUNNER_FLUTTER_WINDOW_H_
diff --git a/packages/camera/camera_windows/example/windows/runner/main.cpp b/packages/camera/camera_windows/example/windows/runner/main.cpp
new file mode 100644
index 0000000..755a90b
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/main.cpp
@@ -0,0 +1,46 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include <flutter/dart_project.h>
+#include <flutter/flutter_view_controller.h>
+#include <windows.h>
+
+#include "flutter_window.h"
+#include "utils.h"
+
+int APIENTRY wWinMain(_In_ HINSTANCE instance, _In_opt_ HINSTANCE prev,
+                      _In_ wchar_t* command_line, _In_ int show_command) {
+  // Attach to console when present (e.g., 'flutter run') or create a
+  // new console when running with a debugger.
+  if (!::AttachConsole(ATTACH_PARENT_PROCESS) && ::IsDebuggerPresent()) {
+    CreateAndAttachConsole();
+  }
+
+  // Initialize COM, so that it is available for use in the library and/or
+  // plugins.
+  ::CoInitializeEx(nullptr, COINIT_APARTMENTTHREADED);
+
+  flutter::DartProject project(L"data");
+
+  std::vector<std::string> command_line_arguments = GetCommandLineArguments();
+
+  project.set_dart_entrypoint_arguments(std::move(command_line_arguments));
+
+  FlutterWindow window(project);
+  Win32Window::Point origin(10, 10);
+  Win32Window::Size size(1280, 720);
+  if (!window.CreateAndShow(L"camera_windows_example", origin, size)) {
+    return EXIT_FAILURE;
+  }
+  window.SetQuitOnClose(true);
+
+  ::MSG msg;
+  while (::GetMessage(&msg, nullptr, 0, 0)) {
+    ::TranslateMessage(&msg);
+    ::DispatchMessage(&msg);
+  }
+
+  ::CoUninitialize();
+  return EXIT_SUCCESS;
+}
diff --git a/packages/camera/camera_windows/example/windows/runner/resource.h b/packages/camera/camera_windows/example/windows/runner/resource.h
new file mode 100644
index 0000000..d5d958d
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/resource.h
@@ -0,0 +1,16 @@
+//{{NO_DEPENDENCIES}}
+// Microsoft Visual C++ generated include file.
+// Used by Runner.rc
+//
+#define IDI_APP_ICON 101
+
+// Next default values for new objects
+//
+#ifdef APSTUDIO_INVOKED
+#ifndef APSTUDIO_READONLY_SYMBOLS
+#define _APS_NEXT_RESOURCE_VALUE 102
+#define _APS_NEXT_COMMAND_VALUE 40001
+#define _APS_NEXT_CONTROL_VALUE 1001
+#define _APS_NEXT_SYMED_VALUE 101
+#endif
+#endif
diff --git a/packages/camera/camera_windows/example/windows/runner/resources/app_icon.ico b/packages/camera/camera_windows/example/windows/runner/resources/app_icon.ico
new file mode 100644
index 0000000..c04e20c
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/resources/app_icon.ico
Binary files differ
diff --git a/packages/camera/camera_windows/example/windows/runner/runner.exe.manifest b/packages/camera/camera_windows/example/windows/runner/runner.exe.manifest
new file mode 100644
index 0000000..c977c4a
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/runner.exe.manifest
@@ -0,0 +1,20 @@
+<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
+<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
+  <application xmlns="urn:schemas-microsoft-com:asm.v3">
+    <windowsSettings>
+      <dpiAwareness xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">PerMonitorV2</dpiAwareness>
+    </windowsSettings>
+  </application>
+  <compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
+    <application>
+      <!-- Windows 10 -->
+      <supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}"/>
+      <!-- Windows 8.1 -->
+      <supportedOS Id="{1f676c76-80e1-4239-95bb-83d0f6d0da78}"/>
+      <!-- Windows 8 -->
+      <supportedOS Id="{4a2f28e3-53b9-4441-ba9c-d69d4a4a6e38}"/>
+      <!-- Windows 7 -->
+      <supportedOS Id="{35138b9a-5d96-4fbd-8e2d-a2440225f93a}"/>
+    </application>
+  </compatibility>
+</assembly>
diff --git a/packages/camera/camera_windows/example/windows/runner/utils.cpp b/packages/camera/camera_windows/example/windows/runner/utils.cpp
new file mode 100644
index 0000000..fb7e945
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/utils.cpp
@@ -0,0 +1,67 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "utils.h"
+
+#include <flutter_windows.h>
+#include <io.h>
+#include <stdio.h>
+#include <windows.h>
+
+#include <iostream>
+
+void CreateAndAttachConsole() {
+  if (::AllocConsole()) {
+    FILE* unused;
+    if (freopen_s(&unused, "CONOUT$", "w", stdout)) {
+      _dup2(_fileno(stdout), 1);
+    }
+    if (freopen_s(&unused, "CONOUT$", "w", stderr)) {
+      _dup2(_fileno(stdout), 2);
+    }
+    std::ios::sync_with_stdio();
+    FlutterDesktopResyncOutputStreams();
+  }
+}
+
+std::vector<std::string> GetCommandLineArguments() {
+  // Convert the UTF-16 command line arguments to UTF-8 for the Engine to use.
+  int argc;
+  wchar_t** argv = ::CommandLineToArgvW(::GetCommandLineW(), &argc);
+  if (argv == nullptr) {
+    return std::vector<std::string>();
+  }
+
+  std::vector<std::string> command_line_arguments;
+
+  // Skip the first argument as it's the binary name.
+  for (int i = 1; i < argc; i++) {
+    command_line_arguments.push_back(Utf8FromUtf16(argv[i]));
+  }
+
+  ::LocalFree(argv);
+
+  return command_line_arguments;
+}
+
+std::string Utf8FromUtf16(const wchar_t* utf16_string) {
+  if (utf16_string == nullptr) {
+    return std::string();
+  }
+  int target_length =
+      ::WideCharToMultiByte(CP_UTF8, WC_ERR_INVALID_CHARS, utf16_string, -1,
+                            nullptr, 0, nullptr, nullptr);
+  if (target_length == 0) {
+    return std::string();
+  }
+  std::string utf8_string;
+  utf8_string.resize(target_length);
+  int converted_length = ::WideCharToMultiByte(
+      CP_UTF8, WC_ERR_INVALID_CHARS, utf16_string, -1, utf8_string.data(),
+      target_length, nullptr, nullptr);
+  if (converted_length == 0) {
+    return std::string();
+  }
+  return utf8_string;
+}
diff --git a/packages/camera/camera_windows/example/windows/runner/utils.h b/packages/camera/camera_windows/example/windows/runner/utils.h
new file mode 100644
index 0000000..bd81e1e
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/utils.h
@@ -0,0 +1,23 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef RUNNER_UTILS_H_
+#define RUNNER_UTILS_H_
+
+#include <string>
+#include <vector>
+
+// Creates a console for the process, and redirects stdout and stderr to
+// it for both the runner and the Flutter library.
+void CreateAndAttachConsole();
+
+// Takes a null-terminated wchar_t* encoded in UTF-16 and returns a std::string
+// encoded in UTF-8. Returns an empty std::string on failure.
+std::string Utf8FromUtf16(const wchar_t* utf16_string);
+
+// Gets the command line arguments passed in as a std::vector<std::string>,
+// encoded in UTF-8. Returns an empty std::vector<std::string> on failure.
+std::vector<std::string> GetCommandLineArguments();
+
+#endif  // RUNNER_UTILS_H_
diff --git a/packages/camera/camera_windows/example/windows/runner/win32_window.cpp b/packages/camera/camera_windows/example/windows/runner/win32_window.cpp
new file mode 100644
index 0000000..85aa361
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/win32_window.cpp
@@ -0,0 +1,241 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "win32_window.h"
+
+#include <flutter_windows.h>
+
+#include "resource.h"
+
+namespace {
+
+constexpr const wchar_t kWindowClassName[] = L"FLUTTER_RUNNER_WIN32_WINDOW";
+
+// The number of Win32Window objects that currently exist.
+static int g_active_window_count = 0;
+
+using EnableNonClientDpiScaling = BOOL __stdcall(HWND hwnd);
+
+// Scale helper to convert logical scaler values to physical using passed in
+// scale factor
+int Scale(int source, double scale_factor) {
+  return static_cast<int>(source * scale_factor);
+}
+
+// Dynamically loads the |EnableNonClientDpiScaling| from the User32 module.
+// This API is only needed for PerMonitor V1 awareness mode.
+void EnableFullDpiSupportIfAvailable(HWND hwnd) {
+  HMODULE user32_module = LoadLibraryA("User32.dll");
+  if (!user32_module) {
+    return;
+  }
+  auto enable_non_client_dpi_scaling =
+      reinterpret_cast<EnableNonClientDpiScaling*>(
+          GetProcAddress(user32_module, "EnableNonClientDpiScaling"));
+  if (enable_non_client_dpi_scaling != nullptr) {
+    enable_non_client_dpi_scaling(hwnd);
+    FreeLibrary(user32_module);
+  }
+}
+
+}  // namespace
+
+// Manages the Win32Window's window class registration.
+class WindowClassRegistrar {
+ public:
+  ~WindowClassRegistrar() = default;
+
+  // Returns the singleton registar instance.
+  static WindowClassRegistrar* GetInstance() {
+    if (!instance_) {
+      instance_ = new WindowClassRegistrar();
+    }
+    return instance_;
+  }
+
+  // Returns the name of the window class, registering the class if it hasn't
+  // previously been registered.
+  const wchar_t* GetWindowClass();
+
+  // Unregisters the window class. Should only be called if there are no
+  // instances of the window.
+  void UnregisterWindowClass();
+
+ private:
+  WindowClassRegistrar() = default;
+
+  static WindowClassRegistrar* instance_;
+
+  bool class_registered_ = false;
+};
+
+WindowClassRegistrar* WindowClassRegistrar::instance_ = nullptr;
+
+const wchar_t* WindowClassRegistrar::GetWindowClass() {
+  if (!class_registered_) {
+    WNDCLASS window_class{};
+    window_class.hCursor = LoadCursor(nullptr, IDC_ARROW);
+    window_class.lpszClassName = kWindowClassName;
+    window_class.style = CS_HREDRAW | CS_VREDRAW;
+    window_class.cbClsExtra = 0;
+    window_class.cbWndExtra = 0;
+    window_class.hInstance = GetModuleHandle(nullptr);
+    window_class.hIcon =
+        LoadIcon(window_class.hInstance, MAKEINTRESOURCE(IDI_APP_ICON));
+    window_class.hbrBackground = 0;
+    window_class.lpszMenuName = nullptr;
+    window_class.lpfnWndProc = Win32Window::WndProc;
+    RegisterClass(&window_class);
+    class_registered_ = true;
+  }
+  return kWindowClassName;
+}
+
+void WindowClassRegistrar::UnregisterWindowClass() {
+  UnregisterClass(kWindowClassName, nullptr);
+  class_registered_ = false;
+}
+
+Win32Window::Win32Window() { ++g_active_window_count; }
+
+Win32Window::~Win32Window() {
+  --g_active_window_count;
+  Destroy();
+}
+
+bool Win32Window::CreateAndShow(const std::wstring& title, const Point& origin,
+                                const Size& size) {
+  Destroy();
+
+  const wchar_t* window_class =
+      WindowClassRegistrar::GetInstance()->GetWindowClass();
+
+  const POINT target_point = {static_cast<LONG>(origin.x),
+                              static_cast<LONG>(origin.y)};
+  HMONITOR monitor = MonitorFromPoint(target_point, MONITOR_DEFAULTTONEAREST);
+  UINT dpi = FlutterDesktopGetDpiForMonitor(monitor);
+  double scale_factor = dpi / 96.0;
+
+  HWND window = CreateWindow(
+      window_class, title.c_str(), WS_OVERLAPPEDWINDOW | WS_VISIBLE,
+      Scale(origin.x, scale_factor), Scale(origin.y, scale_factor),
+      Scale(size.width, scale_factor), Scale(size.height, scale_factor),
+      nullptr, nullptr, GetModuleHandle(nullptr), this);
+
+  if (!window) {
+    return false;
+  }
+
+  return OnCreate();
+}
+
+// static
+LRESULT CALLBACK Win32Window::WndProc(HWND const window, UINT const message,
+                                      WPARAM const wparam,
+                                      LPARAM const lparam) noexcept {
+  if (message == WM_NCCREATE) {
+    auto window_struct = reinterpret_cast<CREATESTRUCT*>(lparam);
+    SetWindowLongPtr(window, GWLP_USERDATA,
+                     reinterpret_cast<LONG_PTR>(window_struct->lpCreateParams));
+
+    auto that = static_cast<Win32Window*>(window_struct->lpCreateParams);
+    EnableFullDpiSupportIfAvailable(window);
+    that->window_handle_ = window;
+  } else if (Win32Window* that = GetThisFromHandle(window)) {
+    return that->MessageHandler(window, message, wparam, lparam);
+  }
+
+  return DefWindowProc(window, message, wparam, lparam);
+}
+
+LRESULT
+Win32Window::MessageHandler(HWND hwnd, UINT const message, WPARAM const wparam,
+                            LPARAM const lparam) noexcept {
+  switch (message) {
+    case WM_DESTROY:
+      window_handle_ = nullptr;
+      Destroy();
+      if (quit_on_close_) {
+        PostQuitMessage(0);
+      }
+      return 0;
+
+    case WM_DPICHANGED: {
+      auto newRectSize = reinterpret_cast<RECT*>(lparam);
+      LONG newWidth = newRectSize->right - newRectSize->left;
+      LONG newHeight = newRectSize->bottom - newRectSize->top;
+
+      SetWindowPos(hwnd, nullptr, newRectSize->left, newRectSize->top, newWidth,
+                   newHeight, SWP_NOZORDER | SWP_NOACTIVATE);
+
+      return 0;
+    }
+    case WM_SIZE: {
+      RECT rect = GetClientArea();
+      if (child_content_ != nullptr) {
+        // Size and position the child window.
+        MoveWindow(child_content_, rect.left, rect.top, rect.right - rect.left,
+                   rect.bottom - rect.top, TRUE);
+      }
+      return 0;
+    }
+
+    case WM_ACTIVATE:
+      if (child_content_ != nullptr) {
+        SetFocus(child_content_);
+      }
+      return 0;
+  }
+
+  return DefWindowProc(window_handle_, message, wparam, lparam);
+}
+
+void Win32Window::Destroy() {
+  OnDestroy();
+
+  if (window_handle_) {
+    DestroyWindow(window_handle_);
+    window_handle_ = nullptr;
+  }
+  if (g_active_window_count == 0) {
+    WindowClassRegistrar::GetInstance()->UnregisterWindowClass();
+  }
+}
+
+Win32Window* Win32Window::GetThisFromHandle(HWND const window) noexcept {
+  return reinterpret_cast<Win32Window*>(
+      GetWindowLongPtr(window, GWLP_USERDATA));
+}
+
+void Win32Window::SetChildContent(HWND content) {
+  child_content_ = content;
+  SetParent(content, window_handle_);
+  RECT frame = GetClientArea();
+
+  MoveWindow(content, frame.left, frame.top, frame.right - frame.left,
+             frame.bottom - frame.top, true);
+
+  SetFocus(child_content_);
+}
+
+RECT Win32Window::GetClientArea() {
+  RECT frame;
+  GetClientRect(window_handle_, &frame);
+  return frame;
+}
+
+HWND Win32Window::GetHandle() { return window_handle_; }
+
+void Win32Window::SetQuitOnClose(bool quit_on_close) {
+  quit_on_close_ = quit_on_close;
+}
+
+bool Win32Window::OnCreate() {
+  // No-op; provided for subclasses.
+  return true;
+}
+
+void Win32Window::OnDestroy() {
+  // No-op; provided for subclasses.
+}
diff --git a/packages/camera/camera_windows/example/windows/runner/win32_window.h b/packages/camera/camera_windows/example/windows/runner/win32_window.h
new file mode 100644
index 0000000..d2a7300
--- /dev/null
+++ b/packages/camera/camera_windows/example/windows/runner/win32_window.h
@@ -0,0 +1,99 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef RUNNER_WIN32_WINDOW_H_
+#define RUNNER_WIN32_WINDOW_H_
+
+#include <windows.h>
+
+#include <functional>
+#include <memory>
+#include <string>
+
+// A class abstraction for a high DPI-aware Win32 window. Intended to be
+// inherited from by classes that wish to specialize with custom
+// rendering and input handling.
+class Win32Window {
+ public:
+  struct Point {
+    unsigned int x;
+    unsigned int y;
+    Point(unsigned int x, unsigned int y) : x(x), y(y) {}
+  };
+
+  struct Size {
+    unsigned int width;
+    unsigned int height;
+    Size(unsigned int width, unsigned int height)
+        : width(width), height(height) {}
+  };
+
+  Win32Window();
+  virtual ~Win32Window();
+
+  // Creates and shows a win32 window with |title| and position and size using
+  // |origin| and |size|. New windows are created on the default monitor.
+  // Window sizes are specified to the OS in physical pixels, so to ensure a
+  // consistent size this function treats the width and height passed in as
+  // logical pixels and scales them appropriately for the default monitor.
+  // Returns true if the window was created successfully.
+  bool CreateAndShow(const std::wstring& title, const Point& origin,
+                     const Size& size);
+
+  // Release OS resources associated with window.
+  void Destroy();
+
+  // Inserts |content| into the window tree.
+  void SetChildContent(HWND content);
+
+  // Returns the backing Window handle to enable clients to set icon and other
+  // window properties. Returns nullptr if the window has been destroyed.
+  HWND GetHandle();
+
+  // If true, closing this window will quit the application.
+  void SetQuitOnClose(bool quit_on_close);
+
+  // Returns a RECT representing the bounds of the current client area.
+  RECT GetClientArea();
+
+ protected:
+  // Processes and routes salient window messages for mouse handling,
+  // size change and DPI. Delegates handling of these to member overloads that
+  // inheriting classes can handle.
+  virtual LRESULT MessageHandler(HWND window, UINT const message,
+                                 WPARAM const wparam,
+                                 LPARAM const lparam) noexcept;
+
+  // Called when CreateAndShow is called, allowing subclass window-related
+  // setup. Subclasses should return false if setup fails.
+  virtual bool OnCreate();
+
+  // Called when Destroy is called.
+  virtual void OnDestroy();
+
+ private:
+  friend class WindowClassRegistrar;
+
+  // OS callback called by the message pump. Handles the WM_NCCREATE message,
+  // which is passed when the non-client area is being created, and enables
+  // automatic non-client DPI scaling so that the non-client area automatically
+  // responds to changes in DPI. All other messages are handled by
+  // MessageHandler.
+  static LRESULT CALLBACK WndProc(HWND const window, UINT const message,
+                                  WPARAM const wparam,
+                                  LPARAM const lparam) noexcept;
+
+  // Retrieves a class instance pointer for |window|
+  static Win32Window* GetThisFromHandle(HWND const window) noexcept;
+
+  bool quit_on_close_ = false;
+
+  // window handle for top level window.
+  HWND window_handle_ = nullptr;
+
+  // window handle for hosted content.
+  HWND child_content_ = nullptr;
+};
+
+#endif  // RUNNER_WIN32_WINDOW_H_
diff --git a/packages/camera/camera_windows/lib/camera_windows.dart b/packages/camera/camera_windows/lib/camera_windows.dart
new file mode 100644
index 0000000..33f8bfb
--- /dev/null
+++ b/packages/camera/camera_windows/lib/camera_windows.dart
@@ -0,0 +1,433 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+import 'dart:async';
+import 'dart:math';
+
+import 'package:camera_platform_interface/camera_platform_interface.dart';
+import 'package:cross_file/cross_file.dart';
+import 'package:flutter/services.dart';
+import 'package:flutter/widgets.dart';
+import 'package:stream_transform/stream_transform.dart';
+
+/// An implementation of [CameraPlatform] for Windows.
+class CameraWindows extends CameraPlatform {
+  /// Registers the Windows implementation of CameraPlatform.
+  static void registerWith() {
+    CameraPlatform.instance = CameraWindows();
+  }
+
+  /// The method channel used to interact with the native platform.
+  @visibleForTesting
+  final MethodChannel pluginChannel =
+      const MethodChannel('plugins.flutter.io/camera_windows');
+
+  /// Camera-specific method channels used to communicate with individual cameras.
+  final Map<int, MethodChannel> _cameraChannels = <int, MethodChannel>{};
+
+  /// The controller that broadcasts events coming from handleCameraMethodCall.
+  ///
+  /// It is a broadcast controller because multiple listeners subscribe to
+  /// different filtered views of its stream.
+  /// This is only exposed for test purposes. It shouldn't be used by clients of
+  /// the plugin as it may break or change at any time.
+  @visibleForTesting
+  final StreamController<CameraEvent> cameraEventStreamController =
+      StreamController<CameraEvent>.broadcast();
+
+  /// Returns a stream of camera events for the given [cameraId].
+  Stream<CameraEvent> _cameraEvents(int cameraId) =>
+      cameraEventStreamController.stream
+          .where((CameraEvent event) => event.cameraId == cameraId);
+
+  @override
+  Future<List<CameraDescription>> availableCameras() async {
+    try {
+      final List<Map<dynamic, dynamic>>? cameras = await pluginChannel
+          .invokeListMethod<Map<dynamic, dynamic>>('availableCameras');
+
+      if (cameras == null) {
+        return <CameraDescription>[];
+      }
+
+      return cameras.map((Map<dynamic, dynamic> camera) {
+        return CameraDescription(
+          name: camera['name'] as String,
+          lensDirection:
+              parseCameraLensDirection(camera['lensFacing'] as String),
+          sensorOrientation: camera['sensorOrientation'] as int,
+        );
+      }).toList();
+    } on PlatformException catch (e) {
+      throw CameraException(e.code, e.message);
+    }
+  }
+
+  @override
+  Future<int> createCamera(
+    CameraDescription cameraDescription,
+    ResolutionPreset? resolutionPreset, {
+    bool enableAudio = false,
+  }) async {
+    try {
+      // If resolutionPreset is not specified, the plugin selects the highest resolution possible.
+      final Map<String, dynamic>? reply = await pluginChannel
+          .invokeMapMethod<String, dynamic>('create', <String, dynamic>{
+        'cameraName': cameraDescription.name,
+        'resolutionPreset': _serializeResolutionPreset(resolutionPreset),
+        'enableAudio': enableAudio,
+      });
+
+      if (reply == null) {
+        throw CameraException('System', 'Cannot create camera');
+      }
+
+      return reply['cameraId']! as int;
+    } on PlatformException catch (e) {
+      throw CameraException(e.code, e.message);
+    }
+  }
+
+  @override
+  Future<void> initializeCamera(
+    int cameraId, {
+    ImageFormatGroup imageFormatGroup = ImageFormatGroup.unknown,
+  }) async {
+    final int requestedCameraId = cameraId;
+
+    // Creates a channel for camera events.
+    _cameraChannels.putIfAbsent(requestedCameraId, () {
+      final MethodChannel channel = MethodChannel(
+          'plugins.flutter.io/camera_windows/camera$requestedCameraId');
+      channel.setMethodCallHandler(
+        (MethodCall call) => handleCameraMethodCall(call, requestedCameraId),
+      );
+      return channel;
+    });
+
+    final Map<String, double>? reply;
+    try {
+      reply = await pluginChannel.invokeMapMethod<String, double>(
+        'initialize',
+        <String, dynamic>{
+          'cameraId': requestedCameraId,
+        },
+      );
+    } on PlatformException catch (e) {
+      throw CameraException(e.code, e.message);
+    }
+
+    cameraEventStreamController.add(
+      CameraInitializedEvent(
+        requestedCameraId,
+        reply!['previewWidth']!,
+        reply['previewHeight']!,
+        ExposureMode.auto,
+        false,
+        FocusMode.auto,
+        false,
+      ),
+    );
+  }
+
+  @override
+  Future<void> dispose(int cameraId) async {
+    await pluginChannel.invokeMethod<void>(
+      'dispose',
+      <String, dynamic>{'cameraId': cameraId},
+    );
+
+    // Destroy the method channel after the camera is disposed so that the last messages can still be handled.
+    if (_cameraChannels.containsKey(cameraId)) {
+      final MethodChannel? cameraChannel = _cameraChannels[cameraId];
+      cameraChannel?.setMethodCallHandler(null);
+      _cameraChannels.remove(cameraId);
+    }
+  }
+
+  @override
+  Stream<CameraInitializedEvent> onCameraInitialized(int cameraId) {
+    return _cameraEvents(cameraId).whereType<CameraInitializedEvent>();
+  }
+
+  @override
+  Stream<CameraResolutionChangedEvent> onCameraResolutionChanged(int cameraId) {
+    // The Windows API does not automatically change the camera's resolution
+    // during capture, so these events are never sent from the platform.
+    // Support for changing the resolution should be implemented if support for
+    // requesting resolution changes is added to the camera platform interface.
+    return const Stream<CameraResolutionChangedEvent>.empty();
+  }
+
+  @override
+  Stream<CameraClosingEvent> onCameraClosing(int cameraId) {
+    return _cameraEvents(cameraId).whereType<CameraClosingEvent>();
+  }
+
+  @override
+  Stream<CameraErrorEvent> onCameraError(int cameraId) {
+    return _cameraEvents(cameraId).whereType<CameraErrorEvent>();
+  }
+
+  @override
+  Stream<VideoRecordedEvent> onVideoRecordedEvent(int cameraId) {
+    return _cameraEvents(cameraId).whereType<VideoRecordedEvent>();
+  }
+
+  @override
+  Stream<DeviceOrientationChangedEvent> onDeviceOrientationChanged() {
+    // TODO(jokerttu): Implement device orientation detection, https://github.com/flutter/flutter/issues/97540.
+    // Force the device orientation to landscape, as the camera plugin defaults to portraitUp orientation.
+    return Stream<DeviceOrientationChangedEvent>.value(
+      const DeviceOrientationChangedEvent(DeviceOrientation.landscapeRight),
+    );
+  }
+
+  @override
+  Future<void> lockCaptureOrientation(
+    int cameraId,
+    DeviceOrientation orientation,
+  ) async {
+    // TODO(jokerttu): Implement lock capture orientation feature, https://github.com/flutter/flutter/issues/97540.
+    throw UnimplementedError('lockCaptureOrientation() is not implemented.');
+  }
+
+  @override
+  Future<void> unlockCaptureOrientation(int cameraId) async {
+    // TODO(jokerttu): Implement unlock capture orientation feature, https://github.com/flutter/flutter/issues/97540.
+    throw UnimplementedError('unlockCaptureOrientation() is not implemented.');
+  }
+
+  @override
+  Future<XFile> takePicture(int cameraId) async {
+    final String? path;
+    path = await pluginChannel.invokeMethod<String>(
+      'takePicture',
+      <String, dynamic>{'cameraId': cameraId},
+    );
+
+    return XFile(path!);
+  }
+
+  @override
+  Future<void> prepareForVideoRecording() =>
+      pluginChannel.invokeMethod<void>('prepareForVideoRecording');
+
+  @override
+  Future<void> startVideoRecording(
+    int cameraId, {
+    Duration? maxVideoDuration,
+  }) async {
+    await pluginChannel.invokeMethod<void>(
+      'startVideoRecording',
+      <String, dynamic>{
+        'cameraId': cameraId,
+        'maxVideoDuration': maxVideoDuration?.inMilliseconds,
+      },
+    );
+  }
+
+  @override
+  Future<XFile> stopVideoRecording(int cameraId) async {
+    final String? path;
+
+    path = await pluginChannel.invokeMethod<String>(
+      'stopVideoRecording',
+      <String, dynamic>{'cameraId': cameraId},
+    );
+
+    return XFile(path!);
+  }
+
+  @override
+  Future<void> pauseVideoRecording(int cameraId) async {
+    throw UnsupportedError(
+        'pauseVideoRecording() is not supported due to Win32 API limitations.');
+  }
+
+  @override
+  Future<void> resumeVideoRecording(int cameraId) async {
+    throw UnsupportedError(
+        'resumeVideoRecording() is not supported due to Win32 API limitations.');
+  }
+
+  @override
+  Future<void> setFlashMode(int cameraId, FlashMode mode) async {
+    // TODO(jokerttu): Implement flash mode support, https://github.com/flutter/flutter/issues/97537.
+    throw UnimplementedError('setFlashMode() is not implemented.');
+  }
+
+  @override
+  Future<void> setExposureMode(int cameraId, ExposureMode mode) async {
+    // TODO(jokerttu): Implement exposure mode support, https://github.com/flutter/flutter/issues/97537.
+    throw UnimplementedError('setExposureMode() is not implemented.');
+  }
+
+  @override
+  Future<void> setExposurePoint(int cameraId, Point<double>? point) async {
+    assert(point == null || point.x >= 0 && point.x <= 1);
+    assert(point == null || point.y >= 0 && point.y <= 1);
+
+    throw UnsupportedError(
+        'setExposurePoint() is not supported due to Win32 API limitations.');
+  }
+
+  @override
+  Future<double> getMinExposureOffset(int cameraId) async {
+    // TODO(jokerttu): Implement exposure control support, https://github.com/flutter/flutter/issues/97537.
+    // Value is returned to support existing implementations.
+    return 0.0;
+  }
+
+  @override
+  Future<double> getMaxExposureOffset(int cameraId) async {
+    // TODO(jokerttu): Implement exposure control support, https://github.com/flutter/flutter/issues/97537.
+    // Value is returned to support existing implementations.
+    return 0.0;
+  }
+
+  @override
+  Future<double> getExposureOffsetStepSize(int cameraId) async {
+    // TODO(jokerttu): Implement exposure control support, https://github.com/flutter/flutter/issues/97537.
+    // Value is returned to support existing implementations.
+    return 1.0;
+  }
+
+  @override
+  Future<double> setExposureOffset(int cameraId, double offset) async {
+    // TODO(jokerttu): Implement exposure control support, https://github.com/flutter/flutter/issues/97537.
+    throw UnimplementedError('setExposureOffset() is not implemented.');
+  }
+
+  @override
+  Future<void> setFocusMode(int cameraId, FocusMode mode) async {
+    // TODO(jokerttu): Implement focus mode support, https://github.com/flutter/flutter/issues/97537.
+    throw UnimplementedError('setFocusMode() is not implemented.');
+  }
+
+  @override
+  Future<void> setFocusPoint(int cameraId, Point<double>? point) async {
+    assert(point == null || point.x >= 0 && point.x <= 1);
+    assert(point == null || point.y >= 0 && point.y <= 1);
+
+    throw UnsupportedError(
+        'setFocusPoint() is not supported due to Win32 API limitations.');
+  }
+
+  @override
+  Future<double> getMinZoomLevel(int cameraId) async {
+    // TODO(jokerttu): Implement zoom level support, https://github.com/flutter/flutter/issues/97537.
+    // Value is returned to support existing implementations.
+    return 1.0;
+  }
+
+  @override
+  Future<double> getMaxZoomLevel(int cameraId) async {
+    // TODO(jokerttu): Implement zoom level support, https://github.com/flutter/flutter/issues/97537.
+    // Value is returned to support existing implementations.
+    return 1.0;
+  }
+
+  @override
+  Future<void> setZoomLevel(int cameraId, double zoom) async {
+    // TODO(jokerttu): Implement zoom level support, https://github.com/flutter/flutter/issues/97537.
+    throw UnimplementedError('setZoomLevel() is not implemented.');
+  }
+
+  @override
+  Future<void> pausePreview(int cameraId) async {
+    await pluginChannel.invokeMethod<double>(
+      'pausePreview',
+      <String, dynamic>{'cameraId': cameraId},
+    );
+  }
+
+  @override
+  Future<void> resumePreview(int cameraId) async {
+    await pluginChannel.invokeMethod<double>(
+      'resumePreview',
+      <String, dynamic>{'cameraId': cameraId},
+    );
+  }
+
+  @override
+  Widget buildPreview(int cameraId) {
+    return Texture(textureId: cameraId);
+  }
+
+  /// Returns the resolution preset as a nullable String.
+  String? _serializeResolutionPreset(ResolutionPreset? resolutionPreset) {
+    switch (resolutionPreset) {
+      case null:
+        return null;
+      case ResolutionPreset.max:
+        return 'max';
+      case ResolutionPreset.ultraHigh:
+        return 'ultraHigh';
+      case ResolutionPreset.veryHigh:
+        return 'veryHigh';
+      case ResolutionPreset.high:
+        return 'high';
+      case ResolutionPreset.medium:
+        return 'medium';
+      case ResolutionPreset.low:
+        return 'low';
+    }
+  }
+
+  /// Converts messages received from the native platform into camera events.
+  ///
+  /// This is only exposed for test purposes. It shouldn't be used by clients
+  /// of the plugin as it may break or change at any time.
+  @visibleForTesting
+  Future<dynamic> handleCameraMethodCall(MethodCall call, int cameraId) async {
+    switch (call.method) {
+      case 'camera_closing':
+        cameraEventStreamController.add(
+          CameraClosingEvent(
+            cameraId,
+          ),
+        );
+        break;
+      case 'video_recorded':
+        // This is called if maxVideoDuration was given on record start.
+        cameraEventStreamController.add(
+          VideoRecordedEvent(
+            cameraId,
+            XFile(call.arguments['path'] as String),
+            call.arguments['maxVideoDuration'] != null
+                ? Duration(
+                    milliseconds: call.arguments['maxVideoDuration'] as int,
+                  )
+                : null,
+          ),
+        );
+        break;
+      case 'error':
+        cameraEventStreamController.add(
+          CameraErrorEvent(
+            cameraId,
+            call.arguments['description'] as String,
+          ),
+        );
+        break;
+      default:
+        throw UnimplementedError();
+    }
+  }
+
+  /// Parses the string representation of the camera lens direction and returns the corresponding enum value.
+  @visibleForTesting
+  CameraLensDirection parseCameraLensDirection(String string) {
+    switch (string) {
+      case 'front':
+        return CameraLensDirection.front;
+      case 'back':
+        return CameraLensDirection.back;
+      case 'external':
+        return CameraLensDirection.external;
+    }
+    throw ArgumentError('Unknown CameraLensDirection value');
+  }
+}
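
For reference, below is a minimal sketch of how this platform implementation is exercised once `registerWith()` has installed it as `CameraPlatform.instance`. In a real app the `camera` plugin's `CameraController` makes these calls; the snippet is not part of the PR and only illustrates the call order the Windows implementation expects (enumerate, create, initialize, capture, dispose). The function name `takeOnePicture` is hypothetical.

```dart
import 'package:camera_platform_interface/camera_platform_interface.dart';
import 'package:cross_file/cross_file.dart';

Future<void> takeOnePicture() async {
  // CameraWindows.registerWith() has set this to the Windows implementation.
  final CameraPlatform camera = CameraPlatform.instance;

  // Enumerate the connected capture devices.
  final List<CameraDescription> cameras = await camera.availableCameras();
  if (cameras.isEmpty) {
    return;
  }

  // Create and initialize the first camera, then capture a single photo.
  final int cameraId =
      await camera.createCamera(cameras.first, ResolutionPreset.high);
  await camera.initializeCamera(cameraId);
  final XFile picture = await camera.takePicture(cameraId);
  print('Saved picture to ${picture.path}');

  await camera.dispose(cameraId);
}
```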
diff --git a/packages/camera/camera_windows/pubspec.yaml b/packages/camera/camera_windows/pubspec.yaml
new file mode 100644
index 0000000..1081c3d
--- /dev/null
+++ b/packages/camera/camera_windows/pubspec.yaml
@@ -0,0 +1,29 @@
+name: camera_windows
+description: A Flutter plugin for getting information about and controlling the camera on Windows.
+version: 0.1.0
+repository: https://github.com/flutter/plugins/tree/master/packages/camera/camera_windows
+issue_tracker: https://github.com/flutter/flutter/issues?q=is%3Aissue+is%3Aopen+label%3A%22p%3A+camera%22
+
+environment:
+  sdk: ">=2.12.0 <3.0.0"
+  flutter: ">=2.8.0"
+
+flutter:
+  plugin:
+    implements: camera
+    platforms:
+      windows:
+        pluginClass: CameraWindows
+        dartPluginClass: CameraWindows
+
+dependencies:
+  camera_platform_interface: ^2.1.2
+  cross_file: ^0.3.1
+  flutter:
+    sdk: flutter
+  stream_transform: ^2.0.0
+
+dev_dependencies:
+  async: ^2.5.0
+  flutter_test:
+    sdk: flutter
diff --git a/packages/camera/camera_windows/test/camera_windows_test.dart b/packages/camera/camera_windows/test/camera_windows_test.dart
new file mode 100644
index 0000000..c1a0fe4
--- /dev/null
+++ b/packages/camera/camera_windows/test/camera_windows_test.dart
@@ -0,0 +1,664 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+import 'package:async/async.dart';
+import 'package:camera_platform_interface/camera_platform_interface.dart';
+import 'package:camera_windows/camera_windows.dart';
+import 'package:flutter/services.dart';
+import 'package:flutter/widgets.dart';
+import 'package:flutter_test/flutter_test.dart';
+import './utils/method_channel_mock.dart';
+
+void main() {
+  const String pluginChannelName = 'plugins.flutter.io/camera_windows';
+  TestWidgetsFlutterBinding.ensureInitialized();
+
+  group('$CameraWindows()', () {
+    test('registered instance', () {
+      CameraWindows.registerWith();
+      expect(CameraPlatform.instance, isA<CameraWindows>());
+    });
+
+    group('Creation, Initialization & Disposal Tests', () {
+      test('Should send creation data and receive back a camera id', () async {
+        // Arrange
+        final MethodChannelMock cameraMockChannel = MethodChannelMock(
+            channelName: pluginChannelName,
+            methods: <String, dynamic>{
+              'create': <String, dynamic>{
+                'cameraId': 1,
+                'imageFormatGroup': 'unknown',
+              }
+            });
+        final CameraWindows plugin = CameraWindows();
+
+        // Act
+        final int cameraId = await plugin.createCamera(
+          const CameraDescription(
+              name: 'Test',
+              lensDirection: CameraLensDirection.front,
+              sensorOrientation: 0),
+          ResolutionPreset.high,
+        );
+
+        // Assert
+        expect(cameraMockChannel.log, <Matcher>[
+          isMethodCall(
+            'create',
+            arguments: <String, Object?>{
+              'cameraName': 'Test',
+              'resolutionPreset': 'high',
+              'enableAudio': false
+            },
+          ),
+        ]);
+        expect(cameraId, 1);
+      });
+
+      test(
+          'Should throw CameraException when create throws a PlatformException',
+          () {
+        // Arrange
+        MethodChannelMock(
+            channelName: pluginChannelName,
+            methods: <String, dynamic>{
+              'create': PlatformException(
+                code: 'TESTING_ERROR_CODE',
+                message: 'Mock error message used during testing.',
+              )
+            });
+        final CameraWindows plugin = CameraWindows();
+
+        // Act
+        expect(
+          () => plugin.createCamera(
+            const CameraDescription(
+              name: 'Test',
+              lensDirection: CameraLensDirection.back,
+              sensorOrientation: 0,
+            ),
+            ResolutionPreset.high,
+          ),
+          throwsA(
+            isA<CameraException>()
+                .having(
+                    (CameraException e) => e.code, 'code', 'TESTING_ERROR_CODE')
+                .having((CameraException e) => e.description, 'description',
+                    'Mock error message used during testing.'),
+          ),
+        );
+      });
+
+      test(
+        'Should throw CameraException when initialize throws a PlatformException',
+        () {
+          // Arrange
+          MethodChannelMock(
+            channelName: pluginChannelName,
+            methods: <String, dynamic>{
+              'initialize': PlatformException(
+                code: 'TESTING_ERROR_CODE',
+                message: 'Mock error message used during testing.',
+              )
+            },
+          );
+          final CameraWindows plugin = CameraWindows();
+
+          // Act
+          expect(
+            () => plugin.initializeCamera(0),
+            throwsA(
+              isA<CameraException>()
+                  .having((CameraException e) => e.code, 'code',
+                      'TESTING_ERROR_CODE')
+                  .having(
+                    (CameraException e) => e.description,
+                    'description',
+                    'Mock error message used during testing.',
+                  ),
+            ),
+          );
+        },
+      );
+
+      test('Should send initialization data', () async {
+        // Arrange
+        final MethodChannelMock cameraMockChannel = MethodChannelMock(
+            channelName: pluginChannelName,
+            methods: <String, dynamic>{
+              'create': <String, dynamic>{
+                'cameraId': 1,
+                'imageFormatGroup': 'unknown',
+              },
+              'initialize': <String, dynamic>{
+                'previewWidth': 1920.toDouble(),
+                'previewHeight': 1080.toDouble()
+              },
+            });
+        final CameraWindows plugin = CameraWindows();
+        final int cameraId = await plugin.createCamera(
+          const CameraDescription(
+            name: 'Test',
+            lensDirection: CameraLensDirection.back,
+            sensorOrientation: 0,
+          ),
+          ResolutionPreset.high,
+        );
+
+        // Act
+        await plugin.initializeCamera(cameraId);
+
+        // Assert
+        expect(cameraId, 1);
+        expect(cameraMockChannel.log, <Matcher>[
+          anything,
+          isMethodCall(
+            'initialize',
+            arguments: <String, Object?>{'cameraId': 1},
+          ),
+        ]);
+      });
+
+      test('Should send a disposal call on dispose', () async {
+        // Arrange
+        final MethodChannelMock cameraMockChannel = MethodChannelMock(
+            channelName: pluginChannelName,
+            methods: <String, dynamic>{
+              'create': <String, dynamic>{'cameraId': 1},
+              'initialize': <String, dynamic>{
+                'previewWidth': 1920.toDouble(),
+                'previewHeight': 1080.toDouble()
+              },
+              'dispose': <String, dynamic>{'cameraId': 1}
+            });
+
+        final CameraWindows plugin = CameraWindows();
+        final int cameraId = await plugin.createCamera(
+          const CameraDescription(
+            name: 'Test',
+            lensDirection: CameraLensDirection.back,
+            sensorOrientation: 0,
+          ),
+          ResolutionPreset.high,
+        );
+        await plugin.initializeCamera(cameraId);
+
+        // Act
+        await plugin.dispose(cameraId);
+
+        // Assert
+        expect(cameraId, 1);
+        expect(cameraMockChannel.log, <Matcher>[
+          anything,
+          anything,
+          isMethodCall(
+            'dispose',
+            arguments: <String, Object?>{'cameraId': 1},
+          ),
+        ]);
+      });
+    });
+
+    group('Event Tests', () {
+      late CameraWindows plugin;
+      late int cameraId;
+      setUp(() async {
+        MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{
+            'create': <String, dynamic>{'cameraId': 1},
+            'initialize': <String, dynamic>{
+              'previewWidth': 1920.toDouble(),
+              'previewHeight': 1080.toDouble()
+            },
+          },
+        );
+
+        plugin = CameraWindows();
+        cameraId = await plugin.createCamera(
+          const CameraDescription(
+            name: 'Test',
+            lensDirection: CameraLensDirection.back,
+            sensorOrientation: 0,
+          ),
+          ResolutionPreset.high,
+        );
+        await plugin.initializeCamera(cameraId);
+      });
+
+      test('Should receive camera closing events', () async {
+        // Act
+        final Stream<CameraClosingEvent> eventStream =
+            plugin.onCameraClosing(cameraId);
+        final StreamQueue<CameraClosingEvent> streamQueue =
+            StreamQueue<CameraClosingEvent>(eventStream);
+
+        // Emit test events
+        final CameraClosingEvent event = CameraClosingEvent(cameraId);
+        await plugin.handleCameraMethodCall(
+            MethodCall('camera_closing', event.toJson()), cameraId);
+        await plugin.handleCameraMethodCall(
+            MethodCall('camera_closing', event.toJson()), cameraId);
+        await plugin.handleCameraMethodCall(
+            MethodCall('camera_closing', event.toJson()), cameraId);
+
+        // Assert
+        expect(await streamQueue.next, event);
+        expect(await streamQueue.next, event);
+        expect(await streamQueue.next, event);
+
+        // Clean up
+        await streamQueue.cancel();
+      });
+
+      test('Should receive camera error events', () async {
+        // Act
+        final Stream<CameraErrorEvent> errorStream =
+            plugin.onCameraError(cameraId);
+        final StreamQueue<CameraErrorEvent> streamQueue =
+            StreamQueue<CameraErrorEvent>(errorStream);
+
+        // Emit test events
+        final CameraErrorEvent event =
+            CameraErrorEvent(cameraId, 'Error Description');
+        await plugin.handleCameraMethodCall(
+            MethodCall('error', event.toJson()), cameraId);
+        await plugin.handleCameraMethodCall(
+            MethodCall('error', event.toJson()), cameraId);
+        await plugin.handleCameraMethodCall(
+            MethodCall('error', event.toJson()), cameraId);
+
+        // Assert
+        expect(await streamQueue.next, event);
+        expect(await streamQueue.next, event);
+        expect(await streamQueue.next, event);
+
+        // Clean up
+        await streamQueue.cancel();
+      });
+    });
+
+    group('Function Tests', () {
+      late CameraWindows plugin;
+      late int cameraId;
+
+      setUp(() async {
+        MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{
+            'create': <String, dynamic>{'cameraId': 1},
+            'initialize': <String, dynamic>{
+              'previewWidth': 1920.toDouble(),
+              'previewHeight': 1080.toDouble()
+            },
+          },
+        );
+        plugin = CameraWindows();
+        cameraId = await plugin.createCamera(
+          const CameraDescription(
+            name: 'Test',
+            lensDirection: CameraLensDirection.back,
+            sensorOrientation: 0,
+          ),
+          ResolutionPreset.high,
+        );
+        await plugin.initializeCamera(cameraId);
+      });
+
+      test('Should fetch CameraDescription instances for available cameras',
+          () async {
+        // Arrange
+        final List<dynamic> returnData = <dynamic>[
+          <String, dynamic>{
+            'name': 'Test 1',
+            'lensFacing': 'front',
+            'sensorOrientation': 1
+          },
+          <String, dynamic>{
+            'name': 'Test 2',
+            'lensFacing': 'back',
+            'sensorOrientation': 2
+          }
+        ];
+        final MethodChannelMock channel = MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{'availableCameras': returnData},
+        );
+
+        // Act
+        final List<CameraDescription> cameras = await plugin.availableCameras();
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('availableCameras', arguments: null),
+        ]);
+        expect(cameras.length, returnData.length);
+        for (int i = 0; i < returnData.length; i++) {
+          final CameraDescription cameraDescription = CameraDescription(
+            name: returnData[i]['name']! as String,
+            lensDirection: plugin.parseCameraLensDirection(
+                returnData[i]['lensFacing']! as String),
+            sensorOrientation: returnData[i]['sensorOrientation']! as int,
+          );
+          expect(cameras[i], cameraDescription);
+        }
+      });
+
+      test(
+          'Should throw CameraException when availableCameras throws a PlatformException',
+          () {
+        // Arrange
+        MethodChannelMock(
+            channelName: pluginChannelName,
+            methods: <String, dynamic>{
+              'availableCameras': PlatformException(
+                code: 'TESTING_ERROR_CODE',
+                message: 'Mock error message used during testing.',
+              )
+            });
+
+        // Act
+        expect(
+          plugin.availableCameras,
+          throwsA(
+            isA<CameraException>()
+                .having(
+                    (CameraException e) => e.code, 'code', 'TESTING_ERROR_CODE')
+                .having((CameraException e) => e.description, 'description',
+                    'Mock error message used during testing.'),
+          ),
+        );
+      });
+
+      test('Should take a picture and return an XFile instance', () async {
+        // Arrange
+        final MethodChannelMock channel = MethodChannelMock(
+            channelName: pluginChannelName,
+            methods: <String, dynamic>{'takePicture': '/test/path.jpg'});
+
+        // Act
+        final XFile file = await plugin.takePicture(cameraId);
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('takePicture', arguments: <String, Object?>{
+            'cameraId': cameraId,
+          }),
+        ]);
+        expect(file.path, '/test/path.jpg');
+      });
+
+      test('Should prepare for video recording', () async {
+        // Arrange
+        final MethodChannelMock channel = MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{'prepareForVideoRecording': null},
+        );
+
+        // Act
+        await plugin.prepareForVideoRecording();
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('prepareForVideoRecording', arguments: null),
+        ]);
+      });
+
+      test('Should start recording a video', () async {
+        // Arrange
+        final MethodChannelMock channel = MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{'startVideoRecording': null},
+        );
+
+        // Act
+        await plugin.startVideoRecording(cameraId);
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('startVideoRecording', arguments: <String, Object?>{
+            'cameraId': cameraId,
+            'maxVideoDuration': null,
+          }),
+        ]);
+      });
+
+      test('Should pass maxVideoDuration when starting recording a video',
+          () async {
+        // Arrange
+        final MethodChannelMock channel = MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{'startVideoRecording': null},
+        );
+
+        // Act
+        await plugin.startVideoRecording(
+          cameraId,
+          maxVideoDuration: const Duration(seconds: 10),
+        );
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('startVideoRecording', arguments: <String, Object?>{
+            'cameraId': cameraId,
+            'maxVideoDuration': 10000
+          }),
+        ]);
+      });
+
+      test('Should stop a video recording and return the file', () async {
+        // Arrange
+        final MethodChannelMock channel = MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{'stopVideoRecording': '/test/path.mp4'},
+        );
+
+        // Act
+        final XFile file = await plugin.stopVideoRecording(cameraId);
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('stopVideoRecording', arguments: <String, Object?>{
+            'cameraId': cameraId,
+          }),
+        ]);
+        expect(file.path, '/test/path.mp4');
+      });
+
+      test('Should throw UnsupportedError when pause video recording is called',
+          () async {
+        // Act
+        expect(
+          () => plugin.pauseVideoRecording(cameraId),
+          throwsA(isA<UnsupportedError>()),
+        );
+      });
+
+      test(
+          'Should throw UnsupportedError when resume video recording is called',
+          () async {
+        // Act
+        expect(
+          () => plugin.resumeVideoRecording(cameraId),
+          throwsA(isA<UnsupportedError>()),
+        );
+      });
+
+      test('Should throw UnimplementedError when flash mode is set', () async {
+        // Act
+        expect(
+          () => plugin.setFlashMode(cameraId, FlashMode.torch),
+          throwsA(isA<UnimplementedError>()),
+        );
+      });
+
+      test('Should throw UnimplementedError when exposure mode is set',
+          () async {
+        // Act
+        expect(
+          () => plugin.setExposureMode(cameraId, ExposureMode.auto),
+          throwsA(isA<UnimplementedError>()),
+        );
+      });
+
+      test('Should throw UnsupportedError when exposure point is set',
+          () async {
+        // Act
+        expect(
+          () => plugin.setExposurePoint(cameraId, null),
+          throwsA(isA<UnsupportedError>()),
+        );
+      });
+
+      test('Should get the min exposure offset', () async {
+        // Act
+        final double minExposureOffset =
+            await plugin.getMinExposureOffset(cameraId);
+
+        // Assert
+        expect(minExposureOffset, 0.0);
+      });
+
+      test('Should get the max exposure offset', () async {
+        // Act
+        final double maxExposureOffset =
+            await plugin.getMaxExposureOffset(cameraId);
+
+        // Assert
+        expect(maxExposureOffset, 0.0);
+      });
+
+      test('Should get the exposure offset step size', () async {
+        // Act
+        final double stepSize =
+            await plugin.getExposureOffsetStepSize(cameraId);
+
+        // Assert
+        expect(stepSize, 1.0);
+      });
+
+      test('Should throw UnimplementedError when exposure offset is set',
+          () async {
+        // Act
+        expect(
+          () => plugin.setExposureOffset(cameraId, 0.5),
+          throwsA(isA<UnimplementedError>()),
+        );
+      });
+
+      test('Should throw UnimplementedError when focus mode is set', () async {
+        // Act
+        expect(
+          () => plugin.setFocusMode(cameraId, FocusMode.auto),
+          throwsA(isA<UnimplementedError>()),
+        );
+      });
+
+      test('Should throw UnsupportedError when focus point is set', () async {
+        // Act
+        expect(
+          () => plugin.setFocusPoint(cameraId, null),
+          throwsA(isA<UnsupportedError>()),
+        );
+      });
+
+      test('Should build a texture widget as preview widget', () async {
+        // Act
+        final Widget widget = plugin.buildPreview(cameraId);
+
+        // Assert
+        expect(widget is Texture, isTrue);
+        expect((widget as Texture).textureId, cameraId);
+      });
+
+      test('Should throw UnimplementedError when handling unknown method', () {
+        final CameraWindows plugin = CameraWindows();
+
+        expect(
+            () => plugin.handleCameraMethodCall(
+                const MethodCall('unknown_method'), 1),
+            throwsA(isA<UnimplementedError>()));
+      });
+
+      test('Should get the max zoom level', () async {
+        // Act
+        final double maxZoomLevel = await plugin.getMaxZoomLevel(cameraId);
+
+        // Assert
+        expect(maxZoomLevel, 1.0);
+      });
+
+      test('Should get the min zoom level', () async {
+        // Act
+        final double maxZoomLevel = await plugin.getMinZoomLevel(cameraId);
+
+        // Assert
+        expect(maxZoomLevel, 1.0);
+      });
+
+      test('Should throw UnimplementedError when zoom level is set', () async {
+        // Act
+        expect(
+          () => plugin.setZoomLevel(cameraId, 2.0),
+          throwsA(isA<UnimplementedError>()),
+        );
+      });
+
+      test(
+          'Should throw UnimplementedError when lock capture orientation is called',
+          () async {
+        // Act
+        expect(
+          () => plugin.lockCaptureOrientation(cameraId, DeviceOrientation.portraitUp),
+          throwsA(isA<UnimplementedError>()),
+        );
+      });
+
+      test(
+          'Should throw UnimplementedError when unlock capture orientation is called',
+          () async {
+        // Act
+        expect(
+          () => plugin.unlockCaptureOrientation(cameraId),
+          throwsA(isA<UnimplementedError>()),
+        );
+      });
+
+      test('Should pause the camera preview', () async {
+        // Arrange
+        final MethodChannelMock channel = MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{'pausePreview': null},
+        );
+
+        // Act
+        await plugin.pausePreview(cameraId);
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('pausePreview',
+              arguments: <String, Object?>{'cameraId': cameraId}),
+        ]);
+      });
+
+      test('Should resume the camera preview', () async {
+        // Arrange
+        final MethodChannelMock channel = MethodChannelMock(
+          channelName: pluginChannelName,
+          methods: <String, dynamic>{'resumePreview': null},
+        );
+
+        // Act
+        await plugin.resumePreview(cameraId);
+
+        // Assert
+        expect(channel.log, <Matcher>[
+          isMethodCall('resumePreview',
+              arguments: <String, Object?>{'cameraId': cameraId}),
+        ]);
+      });
+    });
+  });
+}
diff --git a/packages/camera/camera_windows/test/utils/method_channel_mock.dart b/packages/camera/camera_windows/test/utils/method_channel_mock.dart
new file mode 100644
index 0000000..22f7ece
--- /dev/null
+++ b/packages/camera/camera_windows/test/utils/method_channel_mock.dart
@@ -0,0 +1,45 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+import 'package:flutter/services.dart';
+import 'package:flutter_test/flutter_test.dart';
+
+/// A mock [MethodChannel] implementation for use in tests.
+class MethodChannelMock {
+  /// Creates a new instance with the specified channel name.
+  ///
+  /// The created channel handles each invocation of a method listed in
+  /// [methods] by returning the value mapped to that method name. If a [delay]
+  /// is specified, results are returned after the delay has elapsed.
+  MethodChannelMock({
+    required String channelName,
+    this.delay,
+    required this.methods,
+  }) : methodChannel = MethodChannel(channelName) {
+    methodChannel.setMockMethodCallHandler(_handler);
+  }
+
+  final Duration? delay;
+  final MethodChannel methodChannel;
+  final Map<String, dynamic> methods;
+  final List<MethodCall> log = <MethodCall>[];
+
+  Future<dynamic> _handler(MethodCall methodCall) async {
+    log.add(methodCall);
+
+    if (!methods.containsKey(methodCall.method)) {
+      throw MissingPluginException('No TEST implementation found for method '
+          '${methodCall.method} on channel ${methodChannel.name}');
+    }
+
+    return Future<dynamic>.delayed(delay ?? Duration.zero, () {
+      final dynamic result = methods[methodCall.method];
+      if (result is Exception) {
+        throw result;
+      }
+
+      return Future<dynamic>.value(result);
+    });
+  }
+}
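
The tests above always exercise the mock's immediate-response path; the optional `delay` parameter exists for simulating slow native replies. A hypothetical sketch (not part of the PR) of how a delayed response could be exercised inside `camera_windows_test.dart`:

```dart
test('Should await a delayed takePicture reply', () async {
  // Respond to 'takePicture' only after 100 ms, as a slow native call would.
  final MethodChannelMock channel = MethodChannelMock(
    channelName: 'plugins.flutter.io/camera_windows',
    delay: const Duration(milliseconds: 100),
    methods: <String, dynamic>{'takePicture': '/test/path.jpg'},
  );
  final CameraWindows plugin = CameraWindows();

  final XFile file = await plugin.takePicture(1);

  expect(channel.log, <Matcher>[
    isMethodCall('takePicture',
        arguments: <String, Object?>{'cameraId': 1}),
  ]);
  expect(file.path, '/test/path.jpg');
});
```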
diff --git a/packages/camera/camera_windows/windows/.gitignore b/packages/camera/camera_windows/windows/.gitignore
new file mode 100644
index 0000000..b3eb2be
--- /dev/null
+++ b/packages/camera/camera_windows/windows/.gitignore
@@ -0,0 +1,17 @@
+flutter/
+
+# Visual Studio user-specific files.
+*.suo
+*.user
+*.userosscache
+*.sln.docstates
+
+# Visual Studio build-related files.
+x64/
+x86/
+
+# Visual Studio cache files
+# files ending in .cache can be ignored
+*.[Cc]ache
+# but keep track of directories ending in .cache
+!*.[Cc]ache/
diff --git a/packages/camera/camera_windows/windows/CMakeLists.txt b/packages/camera/camera_windows/windows/CMakeLists.txt
new file mode 100644
index 0000000..caeb109
--- /dev/null
+++ b/packages/camera/camera_windows/windows/CMakeLists.txt
@@ -0,0 +1,99 @@
+cmake_minimum_required(VERSION 3.14)
+set(PROJECT_NAME "camera_windows")
+project(${PROJECT_NAME} LANGUAGES CXX)
+
+# This value is used when generating builds using this plugin, so it must
+# not be changed
+set(PLUGIN_NAME "${PROJECT_NAME}_plugin")
+
+list(APPEND PLUGIN_SOURCES
+  "camera_plugin.h"
+  "camera_plugin.cpp"
+  "camera.h"
+  "camera.cpp"
+  "capture_controller.h"
+  "capture_controller.cpp"
+  "capture_controller_listener.h"
+  "capture_engine_listener.h"
+  "capture_engine_listener.cpp"
+  "string_utils.h"
+  "string_utils.cpp"
+  "capture_device_info.h"
+  "capture_device_info.cpp"
+  "preview_handler.h"
+  "preview_handler.cpp"
+  "record_handler.h"
+  "record_handler.cpp"
+  "photo_handler.h"
+  "photo_handler.cpp"
+  "texture_handler.h"
+  "texture_handler.cpp"
+  "com_heap_ptr.h"
+)
+
+add_library(${PLUGIN_NAME} SHARED
+  "camera_windows.cpp"
+  "include/camera_windows/camera_windows.h"
+  ${PLUGIN_SOURCES}
+)
+
+apply_standard_settings(${PLUGIN_NAME})
+set_target_properties(${PLUGIN_NAME} PROPERTIES
+  CXX_VISIBILITY_PRESET hidden)
+target_compile_definitions(${PLUGIN_NAME} PRIVATE FLUTTER_PLUGIN_IMPL)
+target_include_directories(${PLUGIN_NAME} INTERFACE
+  "${CMAKE_CURRENT_SOURCE_DIR}/include")
+target_link_libraries(${PLUGIN_NAME} PRIVATE flutter flutter_wrapper_plugin)
+target_link_libraries(${PLUGIN_NAME} PRIVATE mf mfplat mfuuid d3d11)
+
+# List of absolute paths to libraries that should be bundled with the plugin
+set(camera_windows_bundled_libraries
+  ""
+  PARENT_SCOPE
+)
+
+
+# === Tests ===
+
+if (${include_${PROJECT_NAME}_tests})
+set(TEST_RUNNER "${PROJECT_NAME}_test")
+enable_testing()
+# TODO(stuartmorgan): Consider using a single shared, pre-checked-in googletest
+# instance rather than downloading for each plugin. This approach makes sense
+# for a template, but not for a monorepo with many plugins.
+include(FetchContent)
+FetchContent_Declare(
+  googletest
+  URL https://github.com/google/googletest/archive/release-1.11.0.zip
+)
+# Prevent overriding the parent project's compiler/linker settings
+set(gtest_force_shared_crt ON CACHE BOOL "" FORCE)
+# Disable install commands for gtest so it doesn't end up in the bundle.
+set(INSTALL_GTEST OFF CACHE BOOL "Disable installation of googletest" FORCE)
+
+FetchContent_MakeAvailable(googletest)
+
+# The plugin's C API is not very useful for unit testing, so build the sources
+# directly into the test binary rather than using the DLL.
+add_executable(${TEST_RUNNER}
+  test/mocks.h
+  test/camera_plugin_test.cpp
+  test/camera_test.cpp
+  test/capture_controller_test.cpp
+  ${PLUGIN_SOURCES}
+)
+apply_standard_settings(${TEST_RUNNER})
+target_include_directories(${TEST_RUNNER} PRIVATE "${CMAKE_CURRENT_SOURCE_DIR}")
+target_link_libraries(${TEST_RUNNER} PRIVATE flutter_wrapper_plugin)
+target_link_libraries(${TEST_RUNNER} PRIVATE mf mfplat mfuuid d3d11)
+target_link_libraries(${TEST_RUNNER} PRIVATE gtest_main gmock)
+
+# flutter_wrapper_plugin has link dependencies on the Flutter DLL.
+add_custom_command(TARGET ${TEST_RUNNER} POST_BUILD
+  COMMAND ${CMAKE_COMMAND} -E copy_if_different
+  "${FLUTTER_LIBRARY}" $<TARGET_FILE_DIR:${TEST_RUNNER}>
+)
+
+include(GoogleTest)
+gtest_discover_tests(${TEST_RUNNER})
+endif()
diff --git a/packages/camera/camera_windows/windows/camera.cpp b/packages/camera/camera_windows/windows/camera.cpp
new file mode 100644
index 0000000..c21f8ab
--- /dev/null
+++ b/packages/camera/camera_windows/windows/camera.cpp
@@ -0,0 +1,264 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "camera.h"
+
+namespace camera_windows {
+using flutter::EncodableList;
+using flutter::EncodableMap;
+using flutter::EncodableValue;
+
+// Camera channel events.
+constexpr char kCameraMethodChannelBaseName[] =
+    "plugins.flutter.io/camera_windows/camera";
+constexpr char kVideoRecordedEvent[] = "video_recorded";
+constexpr char kCameraClosingEvent[] = "camera_closing";
+constexpr char kErrorEvent[] = "error";
+
+CameraImpl::CameraImpl(const std::string& device_id)
+    : Camera(device_id), device_id_(device_id) {}
+
+CameraImpl::~CameraImpl() {
+  // Sends camera closing event.
+  OnCameraClosing();
+
+  capture_controller_ = nullptr;
+  SendErrorForPendingResults("plugin_disposed",
+                             "Plugin disposed before request was handled");
+}
+
+void CameraImpl::InitCamera(flutter::TextureRegistrar* texture_registrar,
+                            flutter::BinaryMessenger* messenger,
+                            bool record_audio,
+                            ResolutionPreset resolution_preset) {
+  auto capture_controller_factory =
+      std::make_unique<CaptureControllerFactoryImpl>();
+  InitCamera(std::move(capture_controller_factory), texture_registrar,
+             messenger, record_audio, resolution_preset);
+}
+
+void CameraImpl::InitCamera(
+    std::unique_ptr<CaptureControllerFactory> capture_controller_factory,
+    flutter::TextureRegistrar* texture_registrar,
+    flutter::BinaryMessenger* messenger, bool record_audio,
+    ResolutionPreset resolution_preset) {
+  assert(!device_id_.empty());
+  messenger_ = messenger;
+  capture_controller_ =
+      capture_controller_factory->CreateCaptureController(this);
+  capture_controller_->InitCaptureDevice(texture_registrar, device_id_,
+                                         record_audio, resolution_preset);
+}
+
+bool CameraImpl::AddPendingResult(
+    PendingResultType type, std::unique_ptr<flutter::MethodResult<>> result) {
+  assert(result);
+
+  auto it = pending_results_.find(type);
+  if (it != pending_results_.end()) {
+    result->Error("Duplicate request", "Method handler already called");
+    return false;
+  }
+
+  pending_results_.insert(std::make_pair(type, std::move(result)));
+  return true;
+}
+
+std::unique_ptr<flutter::MethodResult<>> CameraImpl::GetPendingResultByType(
+    PendingResultType type) {
+  auto it = pending_results_.find(type);
+  if (it == pending_results_.end()) {
+    return nullptr;
+  }
+  auto result = std::move(it->second);
+  pending_results_.erase(it);
+  return result;
+}
+
+bool CameraImpl::HasPendingResultByType(PendingResultType type) const {
+  auto it = pending_results_.find(type);
+  if (it == pending_results_.end()) {
+    return false;
+  }
+  return it->second != nullptr;
+}
+
+void CameraImpl::SendErrorForPendingResults(const std::string& error_code,
+                                            const std::string& description) {
+  for (const auto& pending_result : pending_results_) {
+    pending_result.second->Error(error_code, description);
+  }
+  pending_results_.clear();
+}
+
+MethodChannel<>* CameraImpl::GetMethodChannel() {
+  assert(messenger_);
+  assert(camera_id_);
+
+  // Use existing channel if initialized
+  if (camera_channel_) {
+    return camera_channel_.get();
+  }
+
+  auto channel_name =
+      std::string(kCameraMethodChannelBaseName) + std::to_string(camera_id_);
+
+  camera_channel_ = std::make_unique<flutter::MethodChannel<>>(
+      messenger_, channel_name, &flutter::StandardMethodCodec::GetInstance());
+
+  return camera_channel_.get();
+}
+
+void CameraImpl::OnCreateCaptureEngineSucceeded(int64_t texture_id) {
+  // Use texture id as camera id
+  camera_id_ = texture_id;
+  auto pending_result =
+      GetPendingResultByType(PendingResultType::kCreateCamera);
+  if (pending_result) {
+    pending_result->Success(EncodableMap(
+        {{EncodableValue("cameraId"), EncodableValue(texture_id)}}));
+  }
+}
+
+void CameraImpl::OnCreateCaptureEngineFailed(const std::string& error) {
+  auto pending_result =
+      GetPendingResultByType(PendingResultType::kCreateCamera);
+  if (pending_result) {
+    pending_result->Error("camera_error", error);
+  }
+}
+
+void CameraImpl::OnStartPreviewSucceeded(int32_t width, int32_t height) {
+  auto pending_result = GetPendingResultByType(PendingResultType::kInitialize);
+  if (pending_result) {
+    pending_result->Success(EncodableValue(EncodableMap({
+        {EncodableValue("previewWidth"),
+         EncodableValue(static_cast<float>(width))},
+        {EncodableValue("previewHeight"),
+         EncodableValue(static_cast<float>(height))},
+    })));
+  }
+};
+
+void CameraImpl::OnStartPreviewFailed(const std::string& error) {
+  auto pending_result = GetPendingResultByType(PendingResultType::kInitialize);
+  if (pending_result) {
+    pending_result->Error("camera_error", error);
+  }
+};
+
+void CameraImpl::OnResumePreviewSucceeded() {
+  auto pending_result =
+      GetPendingResultByType(PendingResultType::kResumePreview);
+  if (pending_result) {
+    pending_result->Success();
+  }
+}
+
+void CameraImpl::OnResumePreviewFailed(const std::string& error) {
+  auto pending_result =
+      GetPendingResultByType(PendingResultType::kResumePreview);
+  if (pending_result) {
+    pending_result->Error("camera_error", error);
+  }
+}
+
+void CameraImpl::OnPausePreviewSucceeded() {
+  auto pending_result =
+      GetPendingResultByType(PendingResultType::kPausePreview);
+  if (pending_result) {
+    pending_result->Success();
+  }
+}
+
+void CameraImpl::OnPausePreviewFailed(const std::string& error) {
+  auto pending_result =
+      GetPendingResultByType(PendingResultType::kPausePreview);
+  if (pending_result) {
+    pending_result->Error("camera_error", error);
+  }
+}
+
+void CameraImpl::OnStartRecordSucceeded() {
+  auto pending_result = GetPendingResultByType(PendingResultType::kStartRecord);
+  if (pending_result) {
+    pending_result->Success();
+  }
+};
+
+void CameraImpl::OnStartRecordFailed(const std::string& error) {
+  auto pending_result = GetPendingResultByType(PendingResultType::kStartRecord);
+  if (pending_result) {
+    pending_result->Error("camera_error", error);
+  }
+};
+
+void CameraImpl::OnStopRecordSucceeded(const std::string& file_path) {
+  auto pending_result = GetPendingResultByType(PendingResultType::kStopRecord);
+  if (pending_result) {
+    pending_result->Success(EncodableValue(file_path));
+  }
+};
+
+void CameraImpl::OnStopRecordFailed(const std::string& error) {
+  auto pending_result = GetPendingResultByType(PendingResultType::kStopRecord);
+  if (pending_result) {
+    pending_result->Error("camera_error", error);
+  }
+};
+
+void CameraImpl::OnTakePictureSucceeded(const std::string& file_path) {
+  auto pending_result = GetPendingResultByType(PendingResultType::kTakePicture);
+  if (pending_result) {
+    pending_result->Success(EncodableValue(file_path));
+  }
+};
+
+void CameraImpl::OnTakePictureFailed(const std::string& error) {
+  auto pending_take_picture_result =
+      GetPendingResultByType(PendingResultType::kTakePicture);
+  if (pending_take_picture_result) {
+    pending_take_picture_result->Error("camera_error", error);
+  }
+};
+
+void CameraImpl::OnVideoRecordSucceeded(const std::string& file_path,
+                                        int64_t video_duration_ms) {
+  if (messenger_ && camera_id_ >= 0) {
+    auto channel = GetMethodChannel();
+
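+    // Notify the Dart side of the completed recording via the per-camera
+    // method channel.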
+    std::unique_ptr<EncodableValue> message_data =
+        std::make_unique<EncodableValue>(
+            EncodableMap({{EncodableValue("path"), EncodableValue(file_path)},
+                          {EncodableValue("maxVideoDuration"),
+                           EncodableValue(video_duration_ms)}}));
+
+    channel->InvokeMethod(kVideoRecordedEvent, std::move(message_data));
+  }
+}
+
+void CameraImpl::OnVideoRecordFailed(const std::string& error) {}
+
+void CameraImpl::OnCaptureError(const std::string& error) {
+  if (messenger_ && camera_id_ >= 0) {
+    auto channel = GetMethodChannel();
+
+    std::unique_ptr<EncodableValue> message_data =
+        std::make_unique<EncodableValue>(EncodableMap(
+            {{EncodableValue("description"), EncodableValue(error)}}));
+    channel->InvokeMethod(kErrorEvent, std::move(message_data));
+  }
+
+  SendErrorForPendingResults("capture_error", error);
+}
+
+void CameraImpl::OnCameraClosing() {
+  if (messenger_ && camera_id_ >= 0) {
+    auto channel = GetMethodChannel();
+    channel->InvokeMethod(kCameraClosingEvent,
+                          std::make_unique<EncodableValue>());
+  }
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/camera.h b/packages/camera/camera_windows/windows/camera.h
new file mode 100644
index 0000000..6996231
--- /dev/null
+++ b/packages/camera/camera_windows/windows/camera.h
@@ -0,0 +1,194 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAMERA_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAMERA_H_
+
+#include <flutter/method_channel.h>
+#include <flutter/standard_method_codec.h>
+
+#include <functional>
+
+#include "capture_controller.h"
+
+namespace camera_windows {
+
+using flutter::EncodableMap;
+using flutter::MethodChannel;
+using flutter::MethodResult;
+
+// A set of result types that are stored
+// for processing asynchronous commands.
+enum class PendingResultType {
+  kCreateCamera,
+  kInitialize,
+  kTakePicture,
+  kStartRecord,
+  kStopRecord,
+  kPausePreview,
+  kResumePreview,
+};
+
+// Interface implemented by cameras.
+//
+// Access is provided to an associated |CaptureController|, which can be used
+// to capture video or photo from the camera.
+class Camera : public CaptureControllerListener {
+ public:
+  explicit Camera(const std::string& device_id) {}
+  virtual ~Camera() = default;
+
+  // Disallow copy and move.
+  Camera(const Camera&) = delete;
+  Camera& operator=(const Camera&) = delete;
+
+  // Tests if this camera has the specified device ID.
+  virtual bool HasDeviceId(std::string& device_id) const = 0;
+
+  // Tests if this camera has the specified camera ID.
+  virtual bool HasCameraId(int64_t camera_id) const = 0;
+
+  // Adds a pending result.
+  //
+  // If a result of the same type is already pending, sends an error to the
+  // given result and returns false; otherwise stores the result and returns
+  // true.
+  virtual bool AddPendingResult(PendingResultType type,
+                                std::unique_ptr<MethodResult<>> result) = 0;
+
+  // Checks if a pending result of the specified type already exists.
+  virtual bool HasPendingResultByType(PendingResultType type) const = 0;
+
+  // Returns a |CaptureController| that allows capturing video or still photos
+  // from this camera.
+  virtual camera_windows::CaptureController* GetCaptureController() = 0;
+
+  // Initializes this camera and its associated capture controller.
+  virtual void InitCamera(flutter::TextureRegistrar* texture_registrar,
+                          flutter::BinaryMessenger* messenger,
+                          bool record_audio,
+                          ResolutionPreset resolution_preset) = 0;
+};
+
+// Concrete implementation of the |Camera| interface.
+//
+// This implementation is responsible for initializing the capture controller,
+// listening for camera events, processing pending results, and notifying
+// application code of processed events via the method channel.
+class CameraImpl : public Camera {
+ public:
+  explicit CameraImpl(const std::string& device_id);
+  virtual ~CameraImpl();
+
+  // Disallow copy and move.
+  CameraImpl(const CameraImpl&) = delete;
+  CameraImpl& operator=(const CameraImpl&) = delete;
+
+  // CaptureControllerListener
+  void OnCreateCaptureEngineSucceeded(int64_t texture_id) override;
+  void OnCreateCaptureEngineFailed(const std::string& error) override;
+  void OnStartPreviewSucceeded(int32_t width, int32_t height) override;
+  void OnStartPreviewFailed(const std::string& error) override;
+  void OnPausePreviewSucceeded() override;
+  void OnPausePreviewFailed(const std::string& error) override;
+  void OnResumePreviewSucceeded() override;
+  void OnResumePreviewFailed(const std::string& error) override;
+  void OnStartRecordSucceeded() override;
+  void OnStartRecordFailed(const std::string& error) override;
+  void OnStopRecordSucceeded(const std::string& file_path) override;
+  void OnStopRecordFailed(const std::string& error) override;
+  void OnTakePictureSucceeded(const std::string& file_path) override;
+  void OnTakePictureFailed(const std::string& error) override;
+  void OnVideoRecordSucceeded(const std::string& file_path,
+                              int64_t video_duration) override;
+  void OnVideoRecordFailed(const std::string& error) override;
+  void OnCaptureError(const std::string& error) override;
+
+  // Camera
+  bool HasDeviceId(std::string& device_id) const override {
+    return device_id_ == device_id;
+  }
+  bool HasCameraId(int64_t camera_id) const override {
+    return camera_id_ == camera_id;
+  }
+  bool AddPendingResult(PendingResultType type,
+                        std::unique_ptr<MethodResult<>> result) override;
+  bool HasPendingResultByType(PendingResultType type) const override;
+  camera_windows::CaptureController* GetCaptureController() override {
+    return capture_controller_.get();
+  }
+  void InitCamera(flutter::TextureRegistrar* texture_registrar,
+                  flutter::BinaryMessenger* messenger, bool record_audio,
+                  ResolutionPreset resolution_preset) override;
+
+  // Initializes the camera and its associated capture controller.
+  //
+  // This is a convenience method called by |InitCamera| but also used in
+  // tests.
+  void InitCamera(
+      std::unique_ptr<CaptureControllerFactory> capture_controller_factory,
+      flutter::TextureRegistrar* texture_registrar,
+      flutter::BinaryMessenger* messenger, bool record_audio,
+      ResolutionPreset resolution_preset);
+
+ private:
+  // Loops through all pending results and calls their error handler with the
+  // given error code and description. Pending results are cleared in the
+  // process.
+  //
+  // error_code: A string error code describing the error.
+  // description: A user-readable error message (optional).
+  void SendErrorForPendingResults(const std::string& error_code,
+                                  const std::string& description);
+
+  // Called when the camera is disposed.
+  // Sends a camera closing message to the camera's method channel.
+  void OnCameraClosing();
+
+  // Initializes the method channel instance if needed and returns a pointer
+  // to it.
+  MethodChannel<>* GetMethodChannel();
+
+  // Finds and removes the pending result of the given type.
+  // Returns nullptr if no result of that type is present.
+  std::unique_ptr<MethodResult<>> GetPendingResultByType(
+      PendingResultType type);
+
+  std::map<PendingResultType, std::unique_ptr<MethodResult<>>> pending_results_;
+  std::unique_ptr<CaptureController> capture_controller_;
+  std::unique_ptr<MethodChannel<>> camera_channel_;
+  flutter::BinaryMessenger* messenger_ = nullptr;
+  int64_t camera_id_ = -1;
+  std::string device_id_;
+};
+
+// Factory class for creating |Camera| instances from a specified device ID.
+class CameraFactory {
+ public:
+  CameraFactory() {}
+  virtual ~CameraFactory() = default;
+
+  // Disallow copy and move.
+  CameraFactory(const CameraFactory&) = delete;
+  CameraFactory& operator=(const CameraFactory&) = delete;
+
+  // Creates camera for given device id.
+  virtual std::unique_ptr<Camera> CreateCamera(
+      const std::string& device_id) = 0;
+};
+
+// Concrete implementation of |CameraFactory|.
+class CameraFactoryImpl : public CameraFactory {
+ public:
+  CameraFactoryImpl() {}
+  virtual ~CameraFactoryImpl() = default;
+
+  // Disallow copy and move.
+  CameraFactoryImpl(const CameraFactoryImpl&) = delete;
+  CameraFactoryImpl& operator=(const CameraFactoryImpl&) = delete;
+
+  std::unique_ptr<Camera> CreateCamera(const std::string& device_id) override {
+    return std::make_unique<CameraImpl>(device_id);
+  }
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAMERA_H_
diff --git a/packages/camera/camera_windows/windows/camera_plugin.cpp b/packages/camera/camera_windows/windows/camera_plugin.cpp
new file mode 100644
index 0000000..3b795e0
--- /dev/null
+++ b/packages/camera/camera_windows/windows/camera_plugin.cpp
@@ -0,0 +1,594 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "camera_plugin.h"
+
+#include <flutter/flutter_view.h>
+#include <flutter/method_channel.h>
+#include <flutter/plugin_registrar_windows.h>
+#include <flutter/standard_method_codec.h>
+#include <mfapi.h>
+#include <mfidl.h>
+#include <shlobj.h>
+#include <shobjidl.h>
+#include <windows.h>
+
+#include <cassert>
+#include <chrono>
+#include <memory>
+
+#include "capture_device_info.h"
+#include "com_heap_ptr.h"
+#include "string_utils.h"
+
+namespace camera_windows {
+using flutter::EncodableList;
+using flutter::EncodableMap;
+using flutter::EncodableValue;
+
+namespace {
+
+// Method channel name, method names, and argument constants.
+constexpr char kChannelName[] = "plugins.flutter.io/camera_windows";
+
+constexpr char kAvailableCamerasMethod[] = "availableCameras";
+constexpr char kCreateMethod[] = "create";
+constexpr char kInitializeMethod[] = "initialize";
+constexpr char kTakePictureMethod[] = "takePicture";
+constexpr char kStartVideoRecordingMethod[] = "startVideoRecording";
+constexpr char kStopVideoRecordingMethod[] = "stopVideoRecording";
+constexpr char kPausePreview[] = "pausePreview";
+constexpr char kResumePreview[] = "resumePreview";
+constexpr char kDisposeMethod[] = "dispose";
+
+constexpr char kCameraNameKey[] = "cameraName";
+constexpr char kResolutionPresetKey[] = "resolutionPreset";
+constexpr char kEnableAudioKey[] = "enableAudio";
+
+constexpr char kCameraIdKey[] = "cameraId";
+constexpr char kMaxVideoDurationKey[] = "maxVideoDuration";
+
+constexpr char kResolutionPresetValueLow[] = "low";
+constexpr char kResolutionPresetValueMedium[] = "medium";
+constexpr char kResolutionPresetValueHigh[] = "high";
+constexpr char kResolutionPresetValueVeryHigh[] = "veryHigh";
+constexpr char kResolutionPresetValueUltraHigh[] = "ultraHigh";
+constexpr char kResolutionPresetValueMax[] = "max";
+
+const std::string kPictureCaptureExtension = "jpeg";
+const std::string kVideoCaptureExtension = "mp4";
+
+// Looks for |key| in |map|, returning the associated value if it is present, or
+// a nullptr if not.
+const EncodableValue* ValueOrNull(const EncodableMap& map, const char* key) {
+  auto it = map.find(EncodableValue(key));
+  if (it == map.end()) {
+    return nullptr;
+  }
+  return &(it->second);
+}
+
+// Looks for |key| in |map|, returning the associated int64 value if it is
+// present, or std::nullopt if not.
+std::optional<int64_t> GetInt64ValueOrNull(const EncodableMap& map,
+                                           const char* key) {
+  auto value = ValueOrNull(map, key);
+  if (!value) {
+    return std::nullopt;
+  }
+
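+  // The standard method codec may encode integers as either 32-bit or 64-bit
+  // values, so both are accepted.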
+  if (std::holds_alternative<int32_t>(*value)) {
+    return static_cast<int64_t>(std::get<int32_t>(*value));
+  }
+  auto val64 = std::get_if<int64_t>(value);
+  if (!val64) {
+    return std::nullopt;
+  }
+  return *val64;
+}
+
+// Parses resolution preset argument to enum value.
+ResolutionPreset ParseResolutionPreset(const std::string& resolution_preset) {
+  if (resolution_preset.compare(kResolutionPresetValueLow) == 0) {
+    return ResolutionPreset::kLow;
+  } else if (resolution_preset.compare(kResolutionPresetValueMedium) == 0) {
+    return ResolutionPreset::kMedium;
+  } else if (resolution_preset.compare(kResolutionPresetValueHigh) == 0) {
+    return ResolutionPreset::kHigh;
+  } else if (resolution_preset.compare(kResolutionPresetValueVeryHigh) == 0) {
+    return ResolutionPreset::kVeryHigh;
+  } else if (resolution_preset.compare(kResolutionPresetValueUltraHigh) == 0) {
+    return ResolutionPreset::kUltraHigh;
+  } else if (resolution_preset.compare(kResolutionPresetValueMax) == 0) {
+    return ResolutionPreset::kMax;
+  }
+  return ResolutionPreset::kAuto;
+}
+
+// Builds a CaptureDeviceInfo object holding the display name and device ID of
+// the given device.
+std::unique_ptr<CaptureDeviceInfo> GetDeviceInfo(IMFActivate* device) {
+  assert(device);
+  auto device_info = std::make_unique<CaptureDeviceInfo>();
+  ComHeapPtr<wchar_t> name;
+  UINT32 name_size;
+
+  HRESULT hr = device->GetAllocatedString(MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME,
+                                          &name, &name_size);
+  if (FAILED(hr)) {
+    return device_info;
+  }
+
+  ComHeapPtr<wchar_t> id;
+  UINT32 id_size;
+  hr = device->GetAllocatedString(
+      MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK, &id, &id_size);
+
+  if (FAILED(hr)) {
+    return device_info;
+  }
+
+  device_info->SetDisplayName(Utf8FromUtf16(std::wstring(name, name_size)));
+  device_info->SetDeviceID(Utf8FromUtf16(std::wstring(id, id_size)));
+  return device_info;
+}
+
+// Builds datetime string from current time.
+// Used as part of the filenames for captured pictures and videos.
+std::string GetCurrentTimeString() {
+  std::chrono::system_clock::duration now =
+      std::chrono::system_clock::now().time_since_epoch();
+
+  auto s = std::chrono::duration_cast<std::chrono::seconds>(now).count();
+  auto ms =
+      std::chrono::duration_cast<std::chrono::milliseconds>(now).count() % 1000;
+
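+  // Format the local time as "%Y_%m%d_%H%M%S_"; milliseconds are appended
+  // below.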
+  struct tm newtime;
+  localtime_s(&newtime, &s);
+
+  std::string time_start = "";
+  time_start.resize(80);
+  size_t len =
+      strftime(&time_start[0], time_start.size(), "%Y_%m%d_%H%M%S_", &newtime);
+  if (len > 0) {
+    time_start.resize(len);
+  }
+
+  // Add milliseconds to make sure the filename is unique
+  return time_start + std::to_string(ms);
+}
+
+// Builds file path for picture capture.
+std::optional<std::string> GetFilePathForPicture() {
+  ComHeapPtr<wchar_t> known_folder_path;
+  HRESULT hr = SHGetKnownFolderPath(FOLDERID_Pictures, KF_FLAG_CREATE, nullptr,
+                                    &known_folder_path);
+  if (FAILED(hr)) {
+    return std::nullopt;
+  }
+
+  std::string path = Utf8FromUtf16(std::wstring(known_folder_path));
+
+  return path + "\\" + "PhotoCapture_" + GetCurrentTimeString() + "." +
+         kPictureCaptureExtension;
+}
+
+// Builds file path for video capture.
+std::optional<std::string> GetFilePathForVideo() {
+  ComHeapPtr<wchar_t> known_folder_path;
+  HRESULT hr = SHGetKnownFolderPath(FOLDERID_Videos, KF_FLAG_CREATE, nullptr,
+                                    &known_folder_path);
+  if (FAILED(hr)) {
+    return std::nullopt;
+  }
+
+  std::string path = Utf8FromUtf16(std::wstring(known_folder_path));
+
+  return path + "\\" + "VideoCapture_" + GetCurrentTimeString() + "." +
+         kVideoCaptureExtension;
+}
+}  // namespace
+
+// static
+void CameraPlugin::RegisterWithRegistrar(
+    flutter::PluginRegistrarWindows* registrar) {
+  auto channel = std::make_unique<flutter::MethodChannel<>>(
+      registrar->messenger(), kChannelName,
+      &flutter::StandardMethodCodec::GetInstance());
+
+  std::unique_ptr<CameraPlugin> plugin = std::make_unique<CameraPlugin>(
+      registrar->texture_registrar(), registrar->messenger());
+
+  channel->SetMethodCallHandler(
+      [plugin_pointer = plugin.get()](const auto& call, auto result) {
+        plugin_pointer->HandleMethodCall(call, std::move(result));
+      });
+
+  registrar->AddPlugin(std::move(plugin));
+}
+
+CameraPlugin::CameraPlugin(flutter::TextureRegistrar* texture_registrar,
+                           flutter::BinaryMessenger* messenger)
+    : texture_registrar_(texture_registrar),
+      messenger_(messenger),
+      camera_factory_(std::make_unique<CameraFactoryImpl>()) {}
+
+CameraPlugin::CameraPlugin(flutter::TextureRegistrar* texture_registrar,
+                           flutter::BinaryMessenger* messenger,
+                           std::unique_ptr<CameraFactory> camera_factory)
+    : texture_registrar_(texture_registrar),
+      messenger_(messenger),
+      camera_factory_(std::move(camera_factory)) {}
+
+CameraPlugin::~CameraPlugin() {}
+
+void CameraPlugin::HandleMethodCall(
+    const flutter::MethodCall<>& method_call,
+    std::unique_ptr<flutter::MethodResult<>> result) {
+  const std::string& method_name = method_call.method_name();
+
+  if (method_name.compare(kAvailableCamerasMethod) == 0) {
+    return AvailableCamerasMethodHandler(std::move(result));
+  } else if (method_name.compare(kCreateMethod) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return CreateMethodHandler(*arguments, std::move(result));
+  } else if (method_name.compare(kInitializeMethod) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return this->InitializeMethodHandler(*arguments, std::move(result));
+  } else if (method_name.compare(kTakePictureMethod) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return TakePictureMethodHandler(*arguments, std::move(result));
+  } else if (method_name.compare(kStartVideoRecordingMethod) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return StartVideoRecordingMethodHandler(*arguments, std::move(result));
+  } else if (method_name.compare(kStopVideoRecordingMethod) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return StopVideoRecordingMethodHandler(*arguments, std::move(result));
+  } else if (method_name.compare(kPausePreview) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return PausePreviewMethodHandler(*arguments, std::move(result));
+  } else if (method_name.compare(kResumePreview) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return ResumePreviewMethodHandler(*arguments, std::move(result));
+  } else if (method_name.compare(kDisposeMethod) == 0) {
+    const auto* arguments =
+        std::get_if<flutter::EncodableMap>(method_call.arguments());
+    assert(arguments);
+
+    return DisposeMethodHandler(*arguments, std::move(result));
+  } else {
+    result->NotImplemented();
+  }
+}
+
+Camera* CameraPlugin::GetCameraByDeviceId(std::string& device_id) {
+  for (auto it = begin(cameras_); it != end(cameras_); ++it) {
+    if ((*it)->HasDeviceId(device_id)) {
+      return it->get();
+    }
+  }
+  return nullptr;
+}
+
+Camera* CameraPlugin::GetCameraByCameraId(int64_t camera_id) {
+  for (auto it = begin(cameras_); it != end(cameras_); ++it) {
+    if ((*it)->HasCameraId(camera_id)) {
+      return it->get();
+    }
+  }
+  return nullptr;
+}
+
+void CameraPlugin::DisposeCameraByCameraId(int64_t camera_id) {
+  for (auto it = begin(cameras_); it != end(cameras_); ++it) {
+    if ((*it)->HasCameraId(camera_id)) {
+      cameras_.erase(it);
+      return;
+    }
+  }
+}
+
+void CameraPlugin::AvailableCamerasMethodHandler(
+    std::unique_ptr<flutter::MethodResult<>> result) {
+  // Enumerate devices.
+  ComHeapPtr<IMFActivate*> devices;
+  UINT32 count = 0;
+  if (!this->EnumerateVideoCaptureDeviceSources(&devices, &count)) {
+    result->Error("System error", "Failed to get available cameras");
+    // No need to free devices here, because the allocation failed.
+    return;
+  }
+
+  if (count == 0) {
+    result->Success(EncodableValue(EncodableList()));
+    return;
+  }
+
+  // Format found devices to the response.
+  EncodableList devices_list;
+  for (UINT32 i = 0; i < count; ++i) {
+    auto device_info = GetDeviceInfo(devices[i]);
+    auto deviceName = device_info->GetUniqueDeviceName();
+
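+    // Lens facing and sensor orientation are not available here, so fixed
+    // placeholder values are reported.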
+    devices_list.push_back(EncodableMap({
+        {EncodableValue("name"), EncodableValue(deviceName)},
+        {EncodableValue("lensFacing"), EncodableValue("front")},
+        {EncodableValue("sensorOrientation"), EncodableValue(0)},
+    }));
+  }
+
+  result->Success(EncodableValue(std::move(devices_list)));
+}
+
+bool CameraPlugin::EnumerateVideoCaptureDeviceSources(IMFActivate*** devices,
+                                                      UINT32* count) {
+  return CaptureControllerImpl::EnumerateVideoCaptureDeviceSources(devices,
+                                                                   count);
+}
+
+void CameraPlugin::CreateMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  // Parse enableAudio argument.
+  const auto* record_audio =
+      std::get_if<bool>(ValueOrNull(args, kEnableAudioKey));
+  if (!record_audio) {
+    return result->Error("argument_error",
+                         std::string(kEnableAudioKey) + " argument missing");
+  }
+
+  // Parse cameraName argument.
+  const auto* camera_name =
+      std::get_if<std::string>(ValueOrNull(args, kCameraNameKey));
+  if (!camera_name) {
+    return result->Error("argument_error",
+                         std::string(kCameraNameKey) + " argument missing");
+  }
+
+  auto device_info = std::make_unique<CaptureDeviceInfo>();
+  if (!device_info->ParseDeviceInfoFromCameraName(*camera_name)) {
+    return result->Error(
+        "camera_error", "Cannot parse argument " + std::string(kCameraNameKey));
+  }
+
+  auto device_id = device_info->GetDeviceId();
+  if (GetCameraByDeviceId(device_id)) {
+    return result->Error("camera_error",
+                         "Camera with given device id already exists. Existing "
+                         "camera must be disposed before creating it again.");
+  }
+
+  std::unique_ptr<camera_windows::Camera> camera =
+      camera_factory_->CreateCamera(device_id);
+
+  if (camera->HasPendingResultByType(PendingResultType::kCreateCamera)) {
+    return result->Error("camera_error",
+                         "Pending camera creation request exists");
+  }
+
+  if (camera->AddPendingResult(PendingResultType::kCreateCamera,
+                               std::move(result))) {
+    // Parse resolution preset argument.
+    const auto* resolution_preset_argument =
+        std::get_if<std::string>(ValueOrNull(args, kResolutionPresetKey));
+    ResolutionPreset resolution_preset;
+    if (resolution_preset_argument) {
+      resolution_preset = ParseResolutionPreset(*resolution_preset_argument);
+    } else {
+      resolution_preset = ResolutionPreset::kAuto;
+    }
+
+    camera->InitCamera(texture_registrar_, messenger_, *record_audio,
+                       resolution_preset);
+    cameras_.push_back(std::move(camera));
+  }
+}
+
+void CameraPlugin::InitializeMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  auto camera_id = GetInt64ValueOrNull(args, kCameraIdKey);
+  if (!camera_id) {
+    return result->Error("argument_error",
+                         std::string(kCameraIdKey) + " missing");
+  }
+
+  auto camera = GetCameraByCameraId(*camera_id);
+  if (!camera) {
+    return result->Error("camera_error", "Camera not created");
+  }
+
+  if (camera->HasPendingResultByType(PendingResultType::kInitialize)) {
+    return result->Error("camera_error",
+                         "Pending initialization request exists");
+  }
+
+  if (camera->AddPendingResult(PendingResultType::kInitialize,
+                               std::move(result))) {
+    auto cc = camera->GetCaptureController();
+    assert(cc);
+    cc->StartPreview();
+  }
+}
+
+void CameraPlugin::PausePreviewMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  auto camera_id = GetInt64ValueOrNull(args, kCameraIdKey);
+  if (!camera_id) {
+    return result->Error("argument_error",
+                         std::string(kCameraIdKey) + " missing");
+  }
+
+  auto camera = GetCameraByCameraId(*camera_id);
+  if (!camera) {
+    return result->Error("camera_error", "Camera not created");
+  }
+
+  if (camera->HasPendingResultByType(PendingResultType::kPausePreview)) {
+    return result->Error("camera_error",
+                         "Pending pause preview request exists");
+  }
+
+  if (camera->AddPendingResult(PendingResultType::kPausePreview,
+                               std::move(result))) {
+    auto cc = camera->GetCaptureController();
+    assert(cc);
+    cc->PausePreview();
+  }
+}
+
+void CameraPlugin::ResumePreviewMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  auto camera_id = GetInt64ValueOrNull(args, kCameraIdKey);
+  if (!camera_id) {
+    return result->Error("argument_error",
+                         std::string(kCameraIdKey) + " missing");
+  }
+
+  auto camera = GetCameraByCameraId(*camera_id);
+  if (!camera) {
+    return result->Error("camera_error", "Camera not created");
+  }
+
+  if (camera->HasPendingResultByType(PendingResultType::kResumePreview)) {
+    return result->Error("camera_error",
+                         "Pending resume preview request exists");
+  }
+
+  if (camera->AddPendingResult(PendingResultType::kResumePreview,
+                               std::move(result))) {
+    auto cc = camera->GetCaptureController();
+    assert(cc);
+    cc->ResumePreview();
+  }
+}
+
+void CameraPlugin::StartVideoRecordingMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  auto camera_id = GetInt64ValueOrNull(args, kCameraIdKey);
+  if (!camera_id) {
+    return result->Error("argument_error",
+                         std::string(kCameraIdKey) + " missing");
+  }
+
+  auto camera = GetCameraByCameraId(*camera_id);
+  if (!camera) {
+    return result->Error("camera_error", "Camera not created");
+  }
+
+  if (camera->HasPendingResultByType(PendingResultType::kStartRecord)) {
+    return result->Error("camera_error",
+                         "Pending start recording request exists");
+  }
+
+  int64_t max_video_duration_ms = -1;
+  auto requested_max_video_duration_ms =
+      std::get_if<std::int32_t>(ValueOrNull(args, kMaxVideoDurationKey));
+
+  if (requested_max_video_duration_ms != nullptr) {
+    max_video_duration_ms = *requested_max_video_duration_ms;
+  }
+
+  std::optional<std::string> path = GetFilePathForVideo();
+  if (path) {
+    if (camera->AddPendingResult(PendingResultType::kStartRecord,
+                                 std::move(result))) {
+      auto cc = camera->GetCaptureController();
+      assert(cc);
+      cc->StartRecord(*path, max_video_duration_ms);
+    }
+  } else {
+    return result->Error("system_error",
+                         "Failed to get path for video capture");
+  }
+}
+
+void CameraPlugin::StopVideoRecordingMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  auto camera_id = GetInt64ValueOrNull(args, kCameraIdKey);
+  if (!camera_id) {
+    return result->Error("argument_error",
+                         std::string(kCameraIdKey) + " missing");
+  }
+
+  auto camera = GetCameraByCameraId(*camera_id);
+  if (!camera) {
+    return result->Error("camera_error", "Camera not created");
+  }
+
+  if (camera->HasPendingResultByType(PendingResultType::kStopRecord)) {
+    return result->Error("camera_error",
+                         "Pending stop recording request exists");
+  }
+
+  if (camera->AddPendingResult(PendingResultType::kStopRecord,
+                               std::move(result))) {
+    auto cc = camera->GetCaptureController();
+    assert(cc);
+    cc->StopRecord();
+  }
+}
+
+void CameraPlugin::TakePictureMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  auto camera_id = GetInt64ValueOrNull(args, kCameraIdKey);
+  if (!camera_id) {
+    return result->Error("argument_error",
+                         std::string(kCameraIdKey) + " missing");
+  }
+
+  auto camera = GetCameraByCameraId(*camera_id);
+  if (!camera) {
+    return result->Error("camera_error", "Camera not created");
+  }
+
+  if (camera->HasPendingResultByType(PendingResultType::kTakePicture)) {
+    return result->Error("camera_error", "Pending take picture request exists");
+  }
+
+  std::optional<std::string> path = GetFilePathForPicture();
+  if (path) {
+    if (camera->AddPendingResult(PendingResultType::kTakePicture,
+                                 std::move(result))) {
+      auto cc = camera->GetCaptureController();
+      assert(cc);
+      cc->TakePicture(*path);
+    }
+  } else {
+    return result->Error("system_error",
+                         "Failed to get capture path for picture");
+  }
+}
+
+void CameraPlugin::DisposeMethodHandler(
+    const EncodableMap& args, std::unique_ptr<flutter::MethodResult<>> result) {
+  auto camera_id = GetInt64ValueOrNull(args, kCameraIdKey);
+  if (!camera_id) {
+    return result->Error("argument_error",
+                         std::string(kCameraIdKey) + " missing");
+  }
+
+  DisposeCameraByCameraId(*camera_id);
+  result->Success();
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/camera_plugin.h b/packages/camera/camera_windows/windows/camera_plugin.h
new file mode 100644
index 0000000..1baa247
--- /dev/null
+++ b/packages/camera/camera_windows/windows/camera_plugin.h
@@ -0,0 +1,132 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAMERA_PLUGIN_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAMERA_PLUGIN_H_
+
+#include <flutter/flutter_view.h>
+#include <flutter/method_channel.h>
+#include <flutter/plugin_registrar_windows.h>
+#include <flutter/standard_method_codec.h>
+
+#include <functional>
+
+#include "camera.h"
+#include "capture_controller.h"
+#include "capture_controller_listener.h"
+
+namespace camera_windows {
+using flutter::MethodResult;
+
+namespace test {
+namespace {
+// Forward declaration of test class.
+class MockCameraPlugin;
+}  // namespace
+}  // namespace test
+
+class CameraPlugin : public flutter::Plugin,
+                     public VideoCaptureDeviceEnumerator {
+ public:
+  static void RegisterWithRegistrar(flutter::PluginRegistrarWindows* registrar);
+
+  CameraPlugin(flutter::TextureRegistrar* texture_registrar,
+               flutter::BinaryMessenger* messenger);
+
+  // Creates a plugin instance with the given CameraFactory instance.
+  // Exists for unit testing with mock implementations.
+  CameraPlugin(flutter::TextureRegistrar* texture_registrar,
+               flutter::BinaryMessenger* messenger,
+               std::unique_ptr<CameraFactory> camera_factory);
+
+  virtual ~CameraPlugin();
+
+  // Disallow copy and move.
+  CameraPlugin(const CameraPlugin&) = delete;
+  CameraPlugin& operator=(const CameraPlugin&) = delete;
+
+  // Called when a method is called on plugin channel.
+  void HandleMethodCall(const flutter::MethodCall<>& method_call,
+                        std::unique_ptr<MethodResult<>> result);
+
+ private:
+  // Loops through cameras and returns camera
+  // with matching device_id or nullptr.
+  Camera* GetCameraByDeviceId(std::string& device_id);
+
+  // Loops through cameras and returns camera
+  // with matching camera_id or nullptr.
+  Camera* GetCameraByCameraId(int64_t camera_id);
+
+  // Disposes camera by camera id.
+  void DisposeCameraByCameraId(int64_t camera_id);
+
+  // Enumerates video capture devices.
+  bool EnumerateVideoCaptureDeviceSources(IMFActivate*** devices,
+                                          UINT32* count) override;
+
+  // Handles availableCameras method calls.
+  // Enumerates video capture devices and
+  // returns list of available camera devices.
+  void AvailableCamerasMethodHandler(
+      std::unique_ptr<flutter::MethodResult<>> result);
+
+  // Handles create method calls.
+  // Creates camera and initializes capture controller for requested device.
+  // Stores result object to be handled after request is processed.
+  void CreateMethodHandler(const EncodableMap& args,
+                           std::unique_ptr<MethodResult<>> result);
+
+  // Handles initialize method calls.
+  // Requests existing camera controller to start preview.
+  // Stores result object to be handled after request is processed.
+  void InitializeMethodHandler(const EncodableMap& args,
+                               std::unique_ptr<MethodResult<>> result);
+
+  // Handles takePicture method calls.
+  // Requests existing camera controller to take photo.
+  // Stores result object to be handled after request is processed.
+  void TakePictureMethodHandler(const EncodableMap& args,
+                                std::unique_ptr<MethodResult<>> result);
+
+  // Handles startVideoRecording method calls.
+  // Requests existing camera controller to start recording.
+  // Stores result object to be handled after request is processed.
+  void StartVideoRecordingMethodHandler(const EncodableMap& args,
+                                        std::unique_ptr<MethodResult<>> result);
+
+  // Handles stopVideoRecording method calls.
+  // Requests existing camera controller to stop recording.
+  // Stores result object to be handled after request is processed.
+  void StopVideoRecordingMethodHandler(const EncodableMap& args,
+                                       std::unique_ptr<MethodResult<>> result);
+
+  // Handles pausePreview method calls.
+  // Requests existing camera controller to pause the preview.
+  // Stores result object to be handled after request is processed.
+  void PausePreviewMethodHandler(const EncodableMap& args,
+                                 std::unique_ptr<MethodResult<>> result);
+
+  // Handles resumePreview method calls.
+  // Requests existing camera controller to resume preview.
+  // Stores result object to be handled after request is processed.
+  void ResumePreviewMethodHandler(const EncodableMap& args,
+                                  std::unique_ptr<MethodResult<>> result);
+
+  // Handles dispose method calls.
+  // Disposes camera if exists.
+  void DisposeMethodHandler(const EncodableMap& args,
+                            std::unique_ptr<MethodResult<>> result);
+
+  std::unique_ptr<CameraFactory> camera_factory_;
+  flutter::TextureRegistrar* texture_registrar_;
+  flutter::BinaryMessenger* messenger_;
+  std::vector<std::unique_ptr<Camera>> cameras_;
+
+  friend class camera_windows::test::MockCameraPlugin;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAMERA_PLUGIN_H_
diff --git a/packages/camera/camera_windows/windows/camera_windows.cpp b/packages/camera/camera_windows/windows/camera_windows.cpp
new file mode 100644
index 0000000..2d6b781
--- /dev/null
+++ b/packages/camera/camera_windows/windows/camera_windows.cpp
@@ -0,0 +1,16 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "include/camera_windows/camera_windows.h"
+
+#include <flutter/plugin_registrar_windows.h>
+
+#include "camera_plugin.h"
+
+void CameraWindowsRegisterWithRegistrar(
+    FlutterDesktopPluginRegistrarRef registrar) {
+  camera_windows::CameraPlugin::RegisterWithRegistrar(
+      flutter::PluginRegistrarManager::GetInstance()
+          ->GetRegistrar<flutter::PluginRegistrarWindows>(registrar));
+}
diff --git a/packages/camera/camera_windows/windows/capture_controller.cpp b/packages/camera/camera_windows/windows/capture_controller.cpp
new file mode 100644
index 0000000..084b036
--- /dev/null
+++ b/packages/camera/camera_windows/windows/capture_controller.cpp
@@ -0,0 +1,861 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "capture_controller.h"
+
+#include <comdef.h>
+#include <wincodec.h>
+#include <wrl/client.h>
+
+#include <cassert>
+#include <chrono>
+
+#include "com_heap_ptr.h"
+#include "photo_handler.h"
+#include "preview_handler.h"
+#include "record_handler.h"
+#include "string_utils.h"
+#include "texture_handler.h"
+
+namespace camera_windows {
+
+using Microsoft::WRL::ComPtr;
+
+CaptureControllerImpl::CaptureControllerImpl(
+    CaptureControllerListener* listener)
+    : CaptureController(), capture_controller_listener_(listener) {}
+
+CaptureControllerImpl::~CaptureControllerImpl() {
+  ResetCaptureController();
+  capture_controller_listener_ = nullptr;
+};
+
+// static
+bool CaptureControllerImpl::EnumerateVideoCaptureDeviceSources(
+    IMFActivate*** devices, UINT32* count) {
+  ComPtr<IMFAttributes> attributes;
+
+  HRESULT hr = MFCreateAttributes(&attributes, 1);
+  if (FAILED(hr)) {
+    return false;
+  }
+
+  hr = attributes->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
+                           MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
+  if (FAILED(hr)) {
+    return false;
+  }
+
+  hr = MFEnumDeviceSources(attributes.Get(), devices, count);
+  if (FAILED(hr)) {
+    return false;
+  }
+
+  return true;
+}
+
+HRESULT CaptureControllerImpl::CreateDefaultAudioCaptureSource() {
+  audio_source_ = nullptr;
+  ComHeapPtr<IMFActivate*> devices;
+  UINT32 count = 0;
+
+  ComPtr<IMFAttributes> attributes;
+  HRESULT hr = MFCreateAttributes(&attributes, 1);
+
+  if (SUCCEEDED(hr)) {
+    hr = attributes->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
+                             MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID);
+  }
+
+  if (SUCCEEDED(hr)) {
+    hr = MFEnumDeviceSources(attributes.Get(), &devices, &count);
+  }
+
+  if (SUCCEEDED(hr) && count > 0) {
+    ComHeapPtr<wchar_t> audio_device_id;
+    UINT32 audio_device_id_size;
+
+    // Use first audio device.
+    hr = devices[0]->GetAllocatedString(
+        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID, &audio_device_id,
+        &audio_device_id_size);
+
+    if (SUCCEEDED(hr)) {
+      ComPtr<IMFAttributes> audio_capture_source_attributes;
+      hr = MFCreateAttributes(&audio_capture_source_attributes, 2);
+
+      if (SUCCEEDED(hr)) {
+        hr = audio_capture_source_attributes->SetGUID(
+            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
+            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID);
+      }
+
+      if (SUCCEEDED(hr)) {
+        hr = audio_capture_source_attributes->SetString(
+            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID,
+            audio_device_id);
+      }
+
+      if (SUCCEEDED(hr)) {
+        hr = MFCreateDeviceSource(audio_capture_source_attributes.Get(),
+                                  audio_source_.GetAddressOf());
+      }
+    }
+  }
+
+  return hr;
+}
+
+HRESULT CaptureControllerImpl::CreateVideoCaptureSourceForDevice(
+    const std::string& video_device_id) {
+  video_source_ = nullptr;
+
+  ComPtr<IMFAttributes> video_capture_source_attributes;
+
+  HRESULT hr = MFCreateAttributes(&video_capture_source_attributes, 2);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = video_capture_source_attributes->SetGUID(
+      MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
+      MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = video_capture_source_attributes->SetString(
+      MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
+      Utf16FromUtf8(video_device_id).c_str());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = MFCreateDeviceSource(video_capture_source_attributes.Get(),
+                            video_source_.GetAddressOf());
+  return hr;
+}
+
+HRESULT CaptureControllerImpl::CreateD3DManagerWithDX11Device() {
+  // TODO: Use existing ANGLE device
+
+  HRESULT hr = S_OK;
+  hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
+                         D3D11_CREATE_DEVICE_VIDEO_SUPPORT, nullptr, 0,
+                         D3D11_SDK_VERSION, &dx11_device_, nullptr, nullptr);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Enable multithread protection
+  ComPtr<ID3D10Multithread> multi_thread;
+  hr = dx11_device_.As(&multi_thread);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  multi_thread->SetMultithreadProtected(TRUE);
+
+  hr = MFCreateDXGIDeviceManager(&dx_device_reset_token_,
+                                 dxgi_device_manager_.GetAddressOf());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = dxgi_device_manager_->ResetDevice(dx11_device_.Get(),
+                                         dx_device_reset_token_);
+  return hr;
+}
+
+HRESULT CaptureControllerImpl::CreateCaptureEngine() {
+  assert(!video_device_id_.empty());
+
+  HRESULT hr = S_OK;
+  ComPtr<IMFAttributes> attributes;
+
+  // Creates capture engine only if not already initialized by test framework
+  if (!capture_engine_) {
+    ComPtr<IMFCaptureEngineClassFactory> capture_engine_factory;
+
+    hr = CoCreateInstance(CLSID_MFCaptureEngineClassFactory, nullptr,
+                          CLSCTX_INPROC_SERVER,
+                          IID_PPV_ARGS(&capture_engine_factory));
+    if (FAILED(hr)) {
+      return hr;
+    }
+
+    // Creates CaptureEngine.
+    hr = capture_engine_factory->CreateInstance(CLSID_MFCaptureEngine,
+                                                IID_PPV_ARGS(&capture_engine_));
+    if (FAILED(hr)) {
+      return hr;
+    }
+  }
+
+  hr = CreateD3DManagerWithDX11Device();
+
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Creates video source only if not already initialized by test framework
+  if (!video_source_) {
+    hr = CreateVideoCaptureSourceForDevice(video_device_id_);
+    if (FAILED(hr)) {
+      return hr;
+    }
+  }
+
+  // Creates audio source only if not already initialized by test framework
+  if (record_audio_ && !audio_source_) {
+    hr = CreateDefaultAudioCaptureSource();
+    if (FAILED(hr)) {
+      return hr;
+    }
+  }
+
+  if (!capture_engine_callback_handler_) {
+    capture_engine_callback_handler_ =
+        ComPtr<CaptureEngineListener>(new CaptureEngineListener(this));
+  }
+
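+  // Pass the DXGI device manager and the audio usage flag to the capture
+  // engine via attributes.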
+  hr = MFCreateAttributes(&attributes, 2);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = attributes->SetUnknown(MF_CAPTURE_ENGINE_D3D_MANAGER,
+                              dxgi_device_manager_.Get());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = attributes->SetUINT32(MF_CAPTURE_ENGINE_USE_VIDEO_DEVICE_ONLY,
+                             !record_audio_);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = capture_engine_->Initialize(capture_engine_callback_handler_.Get(),
+                                   attributes.Get(), audio_source_.Get(),
+                                   video_source_.Get());
+  return hr;
+}
+
+void CaptureControllerImpl::ResetCaptureController() {
+  if (record_handler_) {
+    if (record_handler_->IsContinuousRecording()) {
+      StopRecord();
+    } else if (record_handler_->IsTimedRecording()) {
+      StopTimedRecord();
+    }
+  }
+
+  if (preview_handler_) {
+    StopPreview();
+  }
+
+  // Shuts down the Media Foundation platform object and releases all
+  // resources, including threads. The application should call MFShutdown the
+  // same number of times as MFStartup.
+  if (media_foundation_started_) {
+    MFShutdown();
+  }
+
+  // States
+  media_foundation_started_ = false;
+  capture_engine_state_ = CaptureEngineState::kNotInitialized;
+  preview_frame_width_ = 0;
+  preview_frame_height_ = 0;
+  capture_engine_callback_handler_ = nullptr;
+  capture_engine_ = nullptr;
+  audio_source_ = nullptr;
+  video_source_ = nullptr;
+  base_preview_media_type_ = nullptr;
+  base_capture_media_type_ = nullptr;
+
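+  // Direct3D resources and capture handlers.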
+  if (dxgi_device_manager_) {
+    dxgi_device_manager_->ResetDevice(dx11_device_.Get(),
+                                      dx_device_reset_token_);
+  }
+  dxgi_device_manager_ = nullptr;
+  dx11_device_ = nullptr;
+
+  record_handler_ = nullptr;
+  preview_handler_ = nullptr;
+  photo_handler_ = nullptr;
+  texture_handler_ = nullptr;
+}
+
+void CaptureControllerImpl::InitCaptureDevice(
+    flutter::TextureRegistrar* texture_registrar, const std::string& device_id,
+    bool record_audio, ResolutionPreset resolution_preset) {
+  assert(capture_controller_listener_);
+
+  if (IsInitialized()) {
+    return capture_controller_listener_->OnCreateCaptureEngineFailed(
+        "Capture device already initialized");
+  } else if (capture_engine_state_ == CaptureEngineState::kInitializing) {
+    return capture_controller_listener_->OnCreateCaptureEngineFailed(
+        "Capture device already initializing");
+  }
+
+  capture_engine_state_ = CaptureEngineState::kInitializing;
+  resolution_preset_ = resolution_preset;
+  record_audio_ = record_audio;
+  texture_registrar_ = texture_registrar;
+  video_device_id_ = device_id;
+
+  // MFStartup must be called before using Media Foundation.
+  if (!media_foundation_started_) {
+    HRESULT hr = MFStartup(MF_VERSION);
+
+    if (FAILED(hr)) {
+      capture_controller_listener_->OnCreateCaptureEngineFailed(
+          "Failed to create camera");
+      ResetCaptureController();
+      return;
+    }
+
+    media_foundation_started_ = true;
+  }
+
+  HRESULT hr = CreateCaptureEngine();
+  if (FAILED(hr)) {
+    capture_controller_listener_->OnCreateCaptureEngineFailed(
+        "Failed to create camera");
+    ResetCaptureController();
+    return;
+  }
+}
+
+void CaptureControllerImpl::TakePicture(const std::string& file_path) {
+  assert(capture_engine_callback_handler_);
+  assert(capture_engine_);
+
+  if (!IsInitialized()) {
+    return OnPicture(false, "Not initialized");
+  }
+
+  if (!base_capture_media_type_) {
+    // Enumerates mediatypes and finds media type for video capture.
+    if (FAILED(FindBaseMediaTypes())) {
+      return OnPicture(false, "Failed to initialize photo capture");
+    }
+  }
+
+  if (!photo_handler_) {
+    photo_handler_ = std::make_unique<PhotoHandler>();
+  } else if (photo_handler_->IsTakingPhoto()) {
+    return OnPicture(false, "Photo already requested");
+  }
+
+  // Check MF_CAPTURE_ENGINE_PHOTO_TAKEN event handling
+  // for response process.
+  if (!photo_handler_->TakePhoto(file_path, capture_engine_.Get(),
+                                 base_capture_media_type_.Get())) {
+    // Destroy the photo handler on error cases to make sure the state is
+    // reset.
+    photo_handler_ = nullptr;
+    return OnPicture(false, "Failed to take photo");
+  }
+}
+
+uint32_t CaptureControllerImpl::GetMaxPreviewHeight() const {
+  switch (resolution_preset_) {
+    case ResolutionPreset::kLow:
+      return 240;
+    case ResolutionPreset::kMedium:
+      return 480;
+    case ResolutionPreset::kHigh:
+      return 720;
+    case ResolutionPreset::kVeryHigh:
+      return 1080;
+    case ResolutionPreset::kUltraHigh:
+      return 2160;
+    case ResolutionPreset::kMax:
+    case ResolutionPreset::kAuto:
+    default:
+      // No limit.
+      return 0xffffffff;
+  }
+}
+
+// Finds the best media type for the given source stream index and maximum
+// height.
+bool FindBestMediaType(DWORD source_stream_index, IMFCaptureSource* source,
+                       IMFMediaType** target_media_type, uint32_t max_height,
+                       uint32_t* target_frame_width,
+                       uint32_t* target_frame_height,
+                       float minimum_accepted_framerate = 15.f) {
+  assert(source);
+  ComPtr<IMFMediaType> media_type;
+
+  uint32_t best_width = 0;
+  uint32_t best_height = 0;
+  float best_framerate = 0.f;
+
+  // Loop native media types.
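+  // GetAvailableDeviceMediaType fails once the index passes the last native
+  // media type, which ends the loop.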
+  for (int i = 0;; i++) {
+    if (FAILED(source->GetAvailableDeviceMediaType(
+            source_stream_index, i, media_type.GetAddressOf()))) {
+      break;
+    }
+
+    uint32_t frame_rate_numerator, frame_rate_denominator;
+    if (FAILED(MFGetAttributeRatio(media_type.Get(), MF_MT_FRAME_RATE,
+                                   &frame_rate_numerator,
+                                   &frame_rate_denominator)) ||
+        !frame_rate_denominator) {
+      continue;
+    }
+
+    float frame_rate =
+        static_cast<float>(frame_rate_numerator) / frame_rate_denominator;
+    if (frame_rate < minimum_accepted_framerate) {
+      continue;
+    }
+
+    uint32_t frame_width;
+    uint32_t frame_height;
+    if (SUCCEEDED(MFGetAttributeSize(media_type.Get(), MF_MT_FRAME_SIZE,
+                                     &frame_width, &frame_height))) {
+      // Update the target media type if this is the best match so far.
+      if (frame_height <= max_height &&
+          (best_width < frame_width || best_height < frame_height ||
+           best_framerate < frame_rate)) {
+        media_type.CopyTo(target_media_type);
+        best_width = frame_width;
+        best_height = frame_height;
+        best_framerate = frame_rate;
+      }
+    }
+  }
+
+  if (target_frame_width && target_frame_height) {
+    *target_frame_width = best_width;
+    *target_frame_height = best_height;
+  }
+
+  return *target_media_type != nullptr;
+}
+
+HRESULT CaptureControllerImpl::FindBaseMediaTypes() {
+  if (!IsInitialized()) {
+    return E_FAIL;
+  }
+
+  ComPtr<IMFCaptureSource> source;
+  HRESULT hr = capture_engine_->GetSource(&source);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Find base media type for previewing.
+  if (!FindBestMediaType(
+          (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_PREVIEW,
+          source.Get(), base_preview_media_type_.GetAddressOf(),
+          GetMaxPreviewHeight(), &preview_frame_width_,
+          &preview_frame_height_)) {
+    return E_FAIL;
+  }
+
+  // Find base media type for record and photo capture.
+  if (!FindBestMediaType(
+          (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_RECORD,
+          source.Get(), base_capture_media_type_.GetAddressOf(), 0xffffffff,
+          nullptr, nullptr)) {
+    return E_FAIL;
+  }
+
+  return S_OK;
+}
+
+void CaptureControllerImpl::StartRecord(const std::string& file_path,
+                                        int64_t max_video_duration_ms) {
+  assert(capture_engine_);
+
+  if (!IsInitialized()) {
+    return OnRecordStarted(false,
+                           "Camera not initialized. Camera should be "
+                           "disposed and reinitialized.");
+  }
+
+  if (!base_capture_media_type_) {
+    // Enumerates mediatypes and finds media type for video capture.
+    if (FAILED(FindBaseMediaTypes())) {
+      return OnRecordStarted(false, "Failed to initialize video recording");
+    }
+  }
+
+  if (!record_handler_) {
+    record_handler_ = std::make_unique<RecordHandler>(record_audio_);
+  } else if (!record_handler_->CanStart()) {
+    return OnRecordStarted(
+        false,
+        "Recording cannot be started. Previous recording must be stopped "
+        "first.");
+  }
+
+  // Check MF_CAPTURE_ENGINE_RECORD_STARTED event handling for response
+  // process.
+  if (!record_handler_->StartRecord(file_path, max_video_duration_ms,
+                                    capture_engine_.Get(),
+                                    base_capture_media_type_.Get())) {
+    // Destroy the record handler on error cases to make sure the state is
+    // reset.
+    record_handler_ = nullptr;
+    return OnRecordStarted(false, "Failed to start video recording");
+  }
+}
+
+void CaptureControllerImpl::StopRecord() {
+  assert(capture_controller_listener_);
+
+  if (!IsInitialized()) {
+    return OnRecordStopped(false,
+                           "Camera not initialized. Camera should be "
+                           "disposed and reinitialized.");
+  }
+
+  if (!record_handler_ || !record_handler_->CanStop()) {
+    return OnRecordStopped(false, "Recording cannot be stopped.");
+  }
+
+  // Check MF_CAPTURE_ENGINE_RECORD_STOPPED event handling for response
+  // process.
+  if (!record_handler_->StopRecord(capture_engine_.Get())) {
+    // Destroy the record handler on error cases to make sure the state is
+    // reset.
+    record_handler_ = nullptr;
+    return OnRecordStopped(false, "Failed to stop video recording");
+  }
+}
+
+// Stops timed recording. Called internally when requested time is passed.
+// Check MF_CAPTURE_ENGINE_RECORD_STOPPED event handling for response process.
+void CaptureControllerImpl::StopTimedRecord() {
+  assert(capture_controller_listener_);
+  if (!record_handler_ || !record_handler_->IsTimedRecording()) {
+    return;
+  }
+
+  if (!record_handler_->StopRecord(capture_engine_.Get())) {
+    // Destroy the record handler on error cases to make sure the state is
+    // reset.
+    record_handler_ = nullptr;
+    return capture_controller_listener_->OnVideoRecordFailed(
+        "Failed to record video");
+  }
+}
+
+// Starts capturing preview frames using the preview handler.
+// OnPreviewStarted is called after the first frame is captured.
+void CaptureControllerImpl::StartPreview() {
+  assert(capture_engine_callback_handler_);
+  assert(capture_engine_);
+  assert(texture_handler_);
+
+  if (!IsInitialized() || !texture_handler_) {
+    return OnPreviewStarted(false,
+                            "Camera not initialized. Camera should be "
+                            "disposed and reinitialized.");
+  }
+
+  if (!base_preview_media_type_) {
+    // Enumerates mediatypes and finds media type for video capture.
+    if (FAILED(FindBaseMediaTypes())) {
+      return OnPreviewStarted(false, "Failed to initialize video preview");
+    }
+  }
+
+  texture_handler_->UpdateTextureSize(preview_frame_width_,
+                                      preview_frame_height_);
+
+  if (!preview_handler_) {
+    preview_handler_ = std::make_unique<PreviewHandler>();
+  } else if (preview_handler_->IsInitialized()) {
+    return OnPreviewStarted(true, "");
+  } else {
+    return OnPreviewStarted(false, "Preview already exists");
+  }
+
+  // The response is processed in the MF_CAPTURE_ENGINE_PREVIEW_STARTED
+  // event handler.
+  if (!preview_handler_->StartPreview(capture_engine_.Get(),
+                                      base_preview_media_type_.Get(),
+                                      capture_engine_callback_handler_.Get())) {
+    // Destroy the preview handler on error to make sure the state is reset.
+    preview_handler_ = nullptr;
+    return OnPreviewStarted(false, "Failed to start video preview");
+  }
+}
+
+// Stops the preview. Called by the destructor.
+// Use the PausePreview and ResumePreview methods for pausing and resuming
+// the preview.
+// The response is processed in the MF_CAPTURE_ENGINE_PREVIEW_STOPPED
+// event handler.
+void CaptureControllerImpl::StopPreview() {
+  assert(capture_engine_);
+
+  if (!IsInitialized() || !preview_handler_) {
+    return;
+  }
+
+  // Requests to stop preview.
+  preview_handler_->StopPreview(capture_engine_.Get());
+}
+
+// Marks the preview as paused.
+// While the preview is paused, captured frames are not processed for the
+// preview and the Flutter texture is not updated.
+void CaptureControllerImpl::PausePreview() {
+  assert(capture_controller_listener_);
+
+  if (!preview_handler_ || !preview_handler_->IsInitialized()) {
+    return capture_controller_listener_->OnPausePreviewFailed(
+        "Preview not started");
+  }
+
+  if (preview_handler_->PausePreview()) {
+    capture_controller_listener_->OnPausePreviewSucceeded();
+  } else {
+    capture_controller_listener_->OnPausePreviewFailed(
+        "Failed to pause preview");
+  }
+}
+
+// Marks the preview as not paused.
+// While the preview is not paused, captured frames are processed for the
+// preview and the Flutter texture is updated.
+void CaptureControllerImpl::ResumePreview() {
+  assert(capture_controller_listener_);
+
+  if (!preview_handler_ || !preview_handler_->IsInitialized()) {
+    return capture_controller_listener_->OnResumePreviewFailed(
+        "Preview not started");
+  }
+
+  if (preview_handler_->ResumePreview()) {
+    capture_controller_listener_->OnResumePreviewSucceeded();
+  } else {
+    capture_controller_listener_->OnResumePreviewFailed(
+        "Failed to pause preview");
+  }
+}
+
+// Handles capture engine events.
+// Called via IMFCaptureEngineOnEventCallback implementation.
+// Implements CaptureEngineObserver::OnEvent.
+void CaptureControllerImpl::OnEvent(IMFMediaEvent* event) {
+  if (!IsInitialized() &&
+      capture_engine_state_ != CaptureEngineState::kInitializing) {
+    return;
+  }
+
+  GUID extended_type_guid;
+  if (SUCCEEDED(event->GetExtendedType(&extended_type_guid))) {
+    std::string error;
+
+    HRESULT event_hr;
+    if (FAILED(event->GetStatus(&event_hr))) {
+      return;
+    }
+
+    if (FAILED(event_hr)) {
+      // Reads system error
+      _com_error err(event_hr);
+      error = Utf8FromUtf16(err.ErrorMessage());
+    }
+
+    if (extended_type_guid == MF_CAPTURE_ENGINE_ERROR) {
+      OnCaptureEngineError(event_hr, error);
+    } else if (extended_type_guid == MF_CAPTURE_ENGINE_INITIALIZED) {
+      OnCaptureEngineInitialized(SUCCEEDED(event_hr), error);
+    } else if (extended_type_guid == MF_CAPTURE_ENGINE_PREVIEW_STARTED) {
+      // Preview is marked as started only after the first frame has been
+      // captured. This is because the capture engine may report that the
+      // preview has started even if an error occurs right after.
+    } else if (extended_type_guid == MF_CAPTURE_ENGINE_PREVIEW_STOPPED) {
+      OnPreviewStopped(SUCCEEDED(event_hr), error);
+    } else if (extended_type_guid == MF_CAPTURE_ENGINE_RECORD_STARTED) {
+      OnRecordStarted(SUCCEEDED(event_hr), error);
+    } else if (extended_type_guid == MF_CAPTURE_ENGINE_RECORD_STOPPED) {
+      OnRecordStopped(SUCCEEDED(event_hr), error);
+    } else if (extended_type_guid == MF_CAPTURE_ENGINE_PHOTO_TAKEN) {
+      OnPicture(SUCCEEDED(event_hr), error);
+    } else if (extended_type_guid == MF_CAPTURE_ENGINE_CAMERA_STREAM_BLOCKED) {
+      // TODO: Inform capture state to flutter.
+    } else if (extended_type_guid ==
+               MF_CAPTURE_ENGINE_CAMERA_STREAM_UNBLOCKED) {
+      // TODO: Inform capture state to flutter.
+    }
+  }
+}
+
+// Handles Picture event and informs CaptureControllerListener.
+void CaptureControllerImpl::OnPicture(bool success, const std::string& error) {
+  if (success && photo_handler_) {
+    if (capture_controller_listener_) {
+      std::string path = photo_handler_->GetPhotoPath();
+      capture_controller_listener_->OnTakePictureSucceeded(path);
+    }
+    photo_handler_->OnPhotoTaken();
+  } else {
+    if (capture_controller_listener_) {
+      capture_controller_listener_->OnTakePictureFailed(error);
+    }
+    // Destroy the photo handler on error to make sure the state is reset.
+    photo_handler_ = nullptr;
+  }
+}
+
+// Handles CaptureEngineInitialized event and informs
+// CaptureControllerListener.
+void CaptureControllerImpl::OnCaptureEngineInitialized(
+    bool success, const std::string& error) {
+  if (capture_controller_listener_) {
+    // Create texture handler and register new texture.
+    texture_handler_ = std::make_unique<TextureHandler>(texture_registrar_);
+
+    int64_t texture_id = texture_handler_->RegisterTexture();
+    if (texture_id >= 0) {
+      capture_controller_listener_->OnCreateCaptureEngineSucceeded(texture_id);
+      capture_engine_state_ = CaptureEngineState::kInitialized;
+    } else {
+      capture_controller_listener_->OnCreateCaptureEngineFailed(
+          "Failed to create texture_id");
+      // Reset state
+      ResetCaptureController();
+    }
+  }
+}
+
+// Handles CaptureEngineError event and informs CaptureControllerListener.
+void CaptureControllerImpl::OnCaptureEngineError(HRESULT hr,
+                                                 const std::string& error) {
+  if (capture_controller_listener_) {
+    capture_controller_listener_->OnCaptureError(error);
+  }
+
+  // TODO: If MF_CAPTURE_ENGINE_ERROR is returned,
+  // should capture controller be reinitialized automatically?
+}
+
+// Handles PreviewStarted event and informs CaptureControllerListener.
+// This should be called only after first frame has been received or
+// in error cases.
+void CaptureControllerImpl::OnPreviewStarted(bool success,
+                                             const std::string& error) {
+  if (preview_handler_ && success) {
+    preview_handler_->OnPreviewStarted();
+  } else {
+    // Destroy the preview handler on error to make sure the state is reset.
+    preview_handler_ = nullptr;
+  }
+
+  if (capture_controller_listener_) {
+    if (success && preview_frame_width_ > 0 && preview_frame_height_ > 0) {
+      capture_controller_listener_->OnStartPreviewSucceeded(
+          preview_frame_width_, preview_frame_height_);
+    } else {
+      capture_controller_listener_->OnStartPreviewFailed(error);
+    }
+  }
+}
+
+// Handles PreviewStopped event.
+void CaptureControllerImpl::OnPreviewStopped(bool success,
+                                             const std::string& error) {
+  // Preview handler is destroyed if preview is stopped as it
+  // does not have any use anymore.
+  preview_handler_ = nullptr;
+}
+
+// Handles RecordStarted event and informs CaptureControllerListener.
+void CaptureControllerImpl::OnRecordStarted(bool success,
+                                            const std::string& error) {
+  if (success && record_handler_) {
+    record_handler_->OnRecordStarted();
+    if (capture_controller_listener_) {
+      capture_controller_listener_->OnStartRecordSucceeded();
+    }
+  } else {
+    if (capture_controller_listener_) {
+      capture_controller_listener_->OnStartRecordFailed(error);
+    }
+
+    // Destroy the record handler on error to make sure the state is reset.
+    record_handler_ = nullptr;
+  }
+}
+
+// Handles RecordStopped event and informs CaptureControllerListener.
+void CaptureControllerImpl::OnRecordStopped(bool success,
+                                            const std::string& error) {
+  if (capture_controller_listener_ && record_handler_) {
+    // Always call the OnStopRecord listener methods so that a separate
+    // stop request made for a timed recording is also handled.
+
+    if (success) {
+      std::string path = record_handler_->GetRecordPath();
+      capture_controller_listener_->OnStopRecordSucceeded(path);
+      if (record_handler_->IsTimedRecording()) {
+        capture_controller_listener_->OnVideoRecordSucceeded(
+            path, (record_handler_->GetRecordedDuration() / 1000));
+      }
+    } else {
+      capture_controller_listener_->OnStopRecordFailed(error);
+      if (record_handler_->IsTimedRecording()) {
+        capture_controller_listener_->OnVideoRecordFailed(error);
+      }
+    }
+  }
+
+  if (success && record_handler_) {
+    record_handler_->OnRecordStopped();
+  } else {
+    // Destroy the record handler on error to make sure the state is reset.
+    record_handler_ = nullptr;
+  }
+}
+
+// Updates the texture handler's buffer with the given data.
+// Called via IMFCaptureEngineOnSampleCallback implementation.
+// Implements CaptureEngineObserver::UpdateBuffer.
+bool CaptureControllerImpl::UpdateBuffer(uint8_t* buffer,
+                                         uint32_t data_length) {
+  if (!texture_handler_) {
+    return false;
+  }
+  return texture_handler_->UpdateBuffer(buffer, data_length);
+}
+
+// Handles capture time update from each processed frame.
+// Stops timed recordings if requested recording duration has passed.
+// Called via IMFCaptureEngineOnSampleCallback implementation.
+// Implements CaptureEngineObserver::UpdateCaptureTime.
+void CaptureControllerImpl::UpdateCaptureTime(uint64_t capture_time_us) {
+  if (!IsInitialized()) {
+    return;
+  }
+
+  if (preview_handler_ && preview_handler_->IsStarting()) {
+    // Informs that the first frame has been captured successfully and the
+    // preview has started.
+    OnPreviewStarted(true, "");
+  }
+
+  // Checks if max_video_duration_ms is passed.
+  if (record_handler_) {
+    record_handler_->UpdateRecordingTime(capture_time_us);
+    if (record_handler_->ShouldStopTimedRecording()) {
+      StopTimedRecord();
+    }
+  }
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/capture_controller.h b/packages/camera/camera_windows/windows/capture_controller.h
new file mode 100644
index 0000000..34e3781
--- /dev/null
+++ b/packages/camera/camera_windows/windows/capture_controller.h
@@ -0,0 +1,292 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_CONTROLLER_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_CONTROLLER_H_
+
+#include <d3d11.h>
+#include <flutter/texture_registrar.h>
+#include <mfapi.h>
+#include <mfcaptureengine.h>
+#include <mferror.h>
+#include <mfidl.h>
+#include <windows.h>
+#include <wrl/client.h>
+
+#include <memory>
+#include <string>
+
+#include "capture_controller_listener.h"
+#include "capture_engine_listener.h"
+#include "photo_handler.h"
+#include "preview_handler.h"
+#include "record_handler.h"
+#include "texture_handler.h"
+
+namespace camera_windows {
+using flutter::TextureRegistrar;
+using Microsoft::WRL::ComPtr;
+
+// Camera resolution presets. Used to request a capture resolution.
+enum class ResolutionPreset {
+  // Automatic resolution, uses the highest resolution available.
+  kAuto,
+  // 240p (320x240)
+  kLow,
+  // 480p (720x480)
+  kMedium,
+  // 720p (1280x720)
+  kHigh,
+  // 1080p (1920x1080)
+  kVeryHigh,
+  // 2160p (4096x2160)
+  kUltraHigh,
+  // The highest resolution available.
+  kMax,
+};
+
+// Camera capture engine state.
+//
+// On creation, |CaptureControllers| start in state |kNotInitialized|.
+// On initialization, the capture controller transitions to the |kInitializing|
+// and then |kInitialized| state.
+enum class CaptureEngineState { kNotInitialized, kInitializing, kInitialized };
+
+// Interface for a class that enumerates video capture device sources.
+class VideoCaptureDeviceEnumerator {
+ private:
+  virtual bool EnumerateVideoCaptureDeviceSources(IMFActivate*** devices,
+                                                  UINT32* count) = 0;
+};
+
+// Interface implemented by capture controllers.
+//
+// Capture controllers are used to capture video streams or still photos from
+// their associated |Camera|.
+class CaptureController {
+ public:
+  CaptureController() {}
+  virtual ~CaptureController() = default;
+
+  // Disallow copy and move.
+  CaptureController(const CaptureController&) = delete;
+  CaptureController& operator=(const CaptureController&) = delete;
+
+  // Initializes the capture controller with the specified device id.
+  //
+  // texture_registrar: Pointer to Flutter TextureRegistrar instance. Used to
+  //                    register texture for capture preview.
+  // device_id:         A string that holds information of camera device id to
+  //                    be captured.
+  // record_audio:      A boolean value telling if audio should be captured on
+  //                    video recording.
+  // resolution_preset: Maximum capture resolution height.
+  virtual void InitCaptureDevice(TextureRegistrar* texture_registrar,
+                                 const std::string& device_id,
+                                 bool record_audio,
+                                 ResolutionPreset resolution_preset) = 0;
+
+  // Returns preview frame width
+  virtual uint32_t GetPreviewWidth() const = 0;
+
+  // Returns preview frame height
+  virtual uint32_t GetPreviewHeight() const = 0;
+
+  // Starts the preview.
+  virtual void StartPreview() = 0;
+
+  // Pauses the preview.
+  virtual void PausePreview() = 0;
+
+  // Resumes the preview.
+  virtual void ResumePreview() = 0;
+
+  // Starts recording video.
+  virtual void StartRecord(const std::string& file_path,
+                           int64_t max_video_duration_ms) = 0;
+
+  // Stops the current video recording.
+  virtual void StopRecord() = 0;
+
+  // Captures a still photo.
+  virtual void TakePicture(const std::string& file_path) = 0;
+};
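+
+// Note: a typical call sequence (informal sketch only, not enforced by the
+// interface) is InitCaptureDevice, then StartPreview, and then TakePicture
+// or StartRecord followed by StopRecord. PausePreview and ResumePreview can
+// be used while the preview is running.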
+
+// Concrete implementation of the |CaptureController| interface.
+//
+// Handles the video preview stream via a |PreviewHandler| instance, video
+// capture via a |RecordHandler| instance, and still photo capture via a
+// |PhotoHandler| instance.
+class CaptureControllerImpl : public CaptureController,
+                              public CaptureEngineObserver {
+ public:
+  static bool EnumerateVideoCaptureDeviceSources(IMFActivate*** devices,
+                                                 UINT32* count);
+
+  explicit CaptureControllerImpl(CaptureControllerListener* listener);
+  virtual ~CaptureControllerImpl();
+
+  // Disallow copy and move.
+  CaptureControllerImpl(const CaptureControllerImpl&) = delete;
+  CaptureControllerImpl& operator=(const CaptureControllerImpl&) = delete;
+
+  // CaptureController
+  void InitCaptureDevice(TextureRegistrar* texture_registrar,
+                         const std::string& device_id, bool record_audio,
+                         ResolutionPreset resolution_preset) override;
+  uint32_t GetPreviewWidth() const override { return preview_frame_width_; }
+  uint32_t GetPreviewHeight() const override { return preview_frame_height_; }
+  void StartPreview() override;
+  void PausePreview() override;
+  void ResumePreview() override;
+  void StartRecord(const std::string& file_path,
+                   int64_t max_video_duration_ms) override;
+  void StopRecord() override;
+  void TakePicture(const std::string& file_path) override;
+
+  // CaptureEngineObserver
+  void OnEvent(IMFMediaEvent* event) override;
+  bool IsReadyForSample() const override {
+    return capture_engine_state_ == CaptureEngineState::kInitialized &&
+           preview_handler_ && preview_handler_->IsRunning();
+  }
+  bool UpdateBuffer(uint8_t* data, uint32_t data_length) override;
+  void UpdateCaptureTime(uint64_t capture_time) override;
+
+  // Sets capture engine, for testing purposes.
+  void SetCaptureEngine(IMFCaptureEngine* capture_engine) {
+    capture_engine_ = capture_engine;
+  }
+
+  // Sets video source, for testing purposes.
+  void SetVideoSource(IMFMediaSource* video_source) {
+    video_source_ = video_source;
+  }
+
+  // Sets audio source, for testing purposes.
+  void SetAudioSource(IMFMediaSource* audio_source) {
+    audio_source_ = audio_source;
+  }
+
+ private:
+  // Helper function that returns the initialized state as a boolean.
+  bool IsInitialized() const {
+    return capture_engine_state_ == CaptureEngineState::kInitialized;
+  }
+
+  // Resets capture controller state.
+  // This is called if capture engine creation fails or is disposed.
+  void ResetCaptureController();
+
+  // Returns the max preview height calculated from the resolution preset.
+  uint32_t GetMaxPreviewHeight() const;
+
+  // Uses first audio source to capture audio.
+  // Note: Enumerating audio sources via platform interface is not supported.
+  HRESULT CreateDefaultAudioCaptureSource();
+
+  // Initializes video capture source from camera device.
+  HRESULT CreateVideoCaptureSourceForDevice(const std::string& video_device_id);
+
+  // Creates DX11 Device and D3D Manager.
+  HRESULT CreateD3DManagerWithDX11Device();
+
+  // Initializes capture engine object.
+  HRESULT CreateCaptureEngine();
+
+  // Enumerates the video source's media types and finds the best resolution
+  // for the preview and video capture.
+  HRESULT FindBaseMediaTypes();
+
+  // Stops a timed video recording. Called internally when the maximum
+  // recording time is exceeded.
+  void StopTimedRecord();
+
+  // Stops preview. Called internally on camera reset and dispose.
+  void StopPreview();
+
+  // Handles the capture engine initialization event.
+  void OnCaptureEngineInitialized(bool success, const std::string& error);
+
+  // Handles capture engine errors.
+  void OnCaptureEngineError(HRESULT hr, const std::string& error);
+
+  // Handles picture events.
+  void OnPicture(bool success, const std::string& error);
+
+  // Handles preview started events.
+  void OnPreviewStarted(bool success, const std::string& error);
+
+  // Handles preview stopped events.
+  void OnPreviewStopped(bool success, const std::string& error);
+
+  // Handles record started events.
+  void OnRecordStarted(bool success, const std::string& error);
+
+  // Handles record stopped events.
+  void OnRecordStopped(bool success, const std::string& error);
+
+  bool media_foundation_started_ = false;
+  bool record_audio_ = false;
+  uint32_t preview_frame_width_ = 0;
+  uint32_t preview_frame_height_ = 0;
+  UINT dx_device_reset_token_ = 0;
+  std::unique_ptr<RecordHandler> record_handler_;
+  std::unique_ptr<PreviewHandler> preview_handler_;
+  std::unique_ptr<PhotoHandler> photo_handler_;
+  std::unique_ptr<TextureHandler> texture_handler_;
+  CaptureControllerListener* capture_controller_listener_;
+
+  std::string video_device_id_;
+  CaptureEngineState capture_engine_state_ =
+      CaptureEngineState::kNotInitialized;
+  ResolutionPreset resolution_preset_ = ResolutionPreset::kMedium;
+  ComPtr<IMFCaptureEngine> capture_engine_;
+  ComPtr<CaptureEngineListener> capture_engine_callback_handler_;
+  ComPtr<IMFDXGIDeviceManager> dxgi_device_manager_;
+  ComPtr<ID3D11Device> dx11_device_;
+  ComPtr<IMFMediaType> base_capture_media_type_;
+  ComPtr<IMFMediaType> base_preview_media_type_;
+  ComPtr<IMFMediaSource> video_source_;
+  ComPtr<IMFMediaSource> audio_source_;
+
+  TextureRegistrar* texture_registrar_ = nullptr;
+};
+
+// Interface for factory classes that create |CaptureController| instances.
+class CaptureControllerFactory {
+ public:
+  CaptureControllerFactory() {}
+  virtual ~CaptureControllerFactory() = default;
+
+  // Disallow copy and move.
+  CaptureControllerFactory(const CaptureControllerFactory&) = delete;
+  CaptureControllerFactory& operator=(const CaptureControllerFactory&) = delete;
+
+  // Create and return a |CaptureController| that makes callbacks on the
+  // specified |CaptureControllerListener|, which must not be null.
+  virtual std::unique_ptr<CaptureController> CreateCaptureController(
+      CaptureControllerListener* listener) = 0;
+};
+
+// Concrete implementation of |CaptureControllerFactory|.
+class CaptureControllerFactoryImpl : public CaptureControllerFactory {
+ public:
+  CaptureControllerFactoryImpl() {}
+  virtual ~CaptureControllerFactoryImpl() = default;
+
+  // Disallow copy and move.
+  CaptureControllerFactoryImpl(const CaptureControllerFactoryImpl&) = delete;
+  CaptureControllerFactoryImpl& operator=(const CaptureControllerFactoryImpl&) =
+      delete;
+
+  std::unique_ptr<CaptureController> CreateCaptureController(
+      CaptureControllerListener* listener) override {
+    return std::make_unique<CaptureControllerImpl>(listener);
+  }
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_CONTROLLER_H_
diff --git a/packages/camera/camera_windows/windows/capture_controller_listener.h b/packages/camera/camera_windows/windows/capture_controller_listener.h
new file mode 100644
index 0000000..0e713ea
--- /dev/null
+++ b/packages/camera/camera_windows/windows/capture_controller_listener.h
@@ -0,0 +1,104 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_CONTROLLER_LISTENER_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_CONTROLLER_LISTENER_H_
+
+#include <functional>
+
+namespace camera_windows {
+
+// Interface for classes that receives callbacks on events from the associated
+// |CaptureController|.
+class CaptureControllerListener {
+ public:
+  virtual ~CaptureControllerListener() = default;
+
+  // Called by CaptureController on successful capture engine initialization.
+  //
+  // texture_id: A 64-bit integer id registered by the TextureRegistrar.
+  virtual void OnCreateCaptureEngineSucceeded(int64_t texture_id) = 0;
+
+  // Called by CaptureController if initializing the capture engine fails.
+  //
+  // error: A string describing the error.
+  virtual void OnCreateCaptureEngineFailed(const std::string& error) = 0;
+
+  // Called by CaptureController on successfully started preview.
+  //
+  // width: Preview frame width.
+  // height: Preview frame height.
+  virtual void OnStartPreviewSucceeded(int32_t width, int32_t height) = 0;
+
+  // Called by CaptureController if starting the preview fails.
+  //
+  // error: A string describing the error.
+  virtual void OnStartPreviewFailed(const std::string& error) = 0;
+
+  // Called by CaptureController on successfully paused preview.
+  virtual void OnPausePreviewSucceeded() = 0;
+
+  // Called by CaptureController if pausing the preview fails.
+  //
+  // error: A string describing the error.
+  virtual void OnPausePreviewFailed(const std::string& error) = 0;
+
+  // Called by CaptureController on successfully resumed preview.
+  virtual void OnResumePreviewSucceeded() = 0;
+
+  // Called by CaptureController if resuming the preview fails.
+  //
+  // error: A string describing the error.
+  virtual void OnResumePreviewFailed(const std::string& error) = 0;
+
+  // Called by CaptureController on successfully started recording.
+  virtual void OnStartRecordSucceeded() = 0;
+
+  // Called by CaptureController if starting the recording fails.
+  //
+  // error: A string describing the error.
+  virtual void OnStartRecordFailed(const std::string& error) = 0;
+
+  // Called by CaptureController on successfully stopped recording.
+  //
+  // file_path: Filesystem path of the recorded video file.
+  virtual void OnStopRecordSucceeded(const std::string& file_path) = 0;
+
+  // Called by CaptureController if stopping the recording fails.
+  //
+  // error: A string describing the error.
+  virtual void OnStopRecordFailed(const std::string& error) = 0;
+
+  // Called by CaptureController on successfully captured picture.
+  //
+  // file_path: Filesystem path of the captured image.
+  virtual void OnTakePictureSucceeded(const std::string& file_path) = 0;
+
+  // Called by CaptureController if taking picture fails.
+  //
+  // error: A string describing the error.
+  virtual void OnTakePictureFailed(const std::string& error) = 0;
+
+  // Called by CaptureController when timed recording is successfully recorded.
+  //
+  // file_path: Filesystem path of the recorded video file.
+  // video_duration_ms: Duration of the recorded video in milliseconds.
+  virtual void OnVideoRecordSucceeded(const std::string& file_path,
+                                      int64_t video_duration_ms) = 0;
+
+  // Called by CaptureController if timed recording fails.
+  //
+  // error: A string describing the error.
+  virtual void OnVideoRecordFailed(const std::string& error) = 0;
+
+  // Called by CaptureController if capture engine returns error.
+  // For example, when the camera is disconnected while in use.
+  //
+  // error: A string describing the error.
+  virtual void OnCaptureError(const std::string& error) = 0;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_CONTROLLER_LISTENER_H_
diff --git a/packages/camera/camera_windows/windows/capture_device_info.cpp b/packages/camera/camera_windows/windows/capture_device_info.cpp
new file mode 100644
index 0000000..446056a
--- /dev/null
+++ b/packages/camera/camera_windows/windows/capture_device_info.cpp
@@ -0,0 +1,29 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "capture_device_info.h"
+
+#include <memory>
+#include <string>
+
+namespace camera_windows {
+std::string CaptureDeviceInfo::GetUniqueDeviceName() const {
+  return display_name_ + " <" + device_id_ + ">";
+}
+
+bool CaptureDeviceInfo::ParseDeviceInfoFromCameraName(
+    const std::string& camera_name) {
+  size_t delimiter_index = camera_name.rfind(' ', camera_name.length());
+  if (delimiter_index != std::string::npos) {
+    display_name_ = camera_name.substr(0, delimiter_index);
+    device_id_ = camera_name.substr(delimiter_index + 2,
+                                    camera_name.length() - delimiter_index - 3);
+    return true;
+  }
+
+  return false;
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/capture_device_info.h b/packages/camera/camera_windows/windows/capture_device_info.h
new file mode 100644
index 0000000..63ffa85
--- /dev/null
+++ b/packages/camera/camera_windows/windows/capture_device_info.h
@@ -0,0 +1,49 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_DEVICE_INFO_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_DEVICE_INFO_H_
+
+#include <string>
+
+namespace camera_windows {
+
+// Name and device ID information for a capture device.
+class CaptureDeviceInfo {
+ public:
+  CaptureDeviceInfo() {}
+  virtual ~CaptureDeviceInfo() = default;
+
+  // Disallow copy and move.
+  CaptureDeviceInfo(const CaptureDeviceInfo&) = delete;
+  CaptureDeviceInfo& operator=(const CaptureDeviceInfo&) = delete;
+
+  // Build unique device name from display name and device id.
+  // Format: "display_name <device_id>".
+  std::string GetUniqueDeviceName() const;
+
+  // Parses display name and device id from unique device name format.
+  // Format: "display_name <device_id>".
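+  //
+  // Example (hypothetical camera name):
+  //   "Integrated Camera <\\?\usb#vid_1234&mi_00#hypothetical_id>"
+  //   parses to display name "Integrated Camera" and device id
+  //   "\\?\usb#vid_1234&mi_00#hypothetical_id".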
+  bool ParseDeviceInfoFromCameraName(const std::string& camera_name);
+
+  // Updates display name.
+  void SetDisplayName(const std::string& display_name) {
+    display_name_ = display_name;
+  }
+
+  // Updates device id.
+  void SetDeviceID(const std::string& device_id) { device_id_ = device_id; }
+
+  // Returns device id.
+  std::string GetDeviceId() const { return device_id_; }
+
+ private:
+  std::string display_name_;
+  std::string device_id_;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_DEVICE_INFO_H_
diff --git a/packages/camera/camera_windows/windows/capture_engine_listener.cpp b/packages/camera/camera_windows/windows/capture_engine_listener.cpp
new file mode 100644
index 0000000..5425b38
--- /dev/null
+++ b/packages/camera/camera_windows/windows/capture_engine_listener.cpp
@@ -0,0 +1,90 @@
+
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "capture_engine_listener.h"
+
+#include <mfcaptureengine.h>
+#include <wrl/client.h>
+
+namespace camera_windows {
+
+using Microsoft::WRL::ComPtr;
+
+// IUnknown
+STDMETHODIMP_(ULONG) CaptureEngineListener::AddRef() {
+  return InterlockedIncrement(&ref_);
+}
+
+// IUnknown
+STDMETHODIMP_(ULONG)
+CaptureEngineListener::Release() {
+  LONG ref = InterlockedDecrement(&ref_);
+  if (ref == 0) {
+    delete this;
+  }
+  return ref;
+}
+
+// IUnknown
+STDMETHODIMP_(HRESULT)
+CaptureEngineListener::QueryInterface(const IID& riid, void** ppv) {
+  *ppv = nullptr;
+
+  if (riid == IID_IMFCaptureEngineOnEventCallback) {
+    *ppv = static_cast<IMFCaptureEngineOnEventCallback*>(this);
+    ((IUnknown*)*ppv)->AddRef();
+    return S_OK;
+  } else if (riid == IID_IMFCaptureEngineOnSampleCallback) {
+    *ppv = static_cast<IMFCaptureEngineOnSampleCallback*>(this);
+    ((IUnknown*)*ppv)->AddRef();
+    return S_OK;
+  }
+
+  return E_NOINTERFACE;
+}
+
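+// IMFCaptureEngineOnEventCallback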
+STDMETHODIMP CaptureEngineListener::OnEvent(IMFMediaEvent* event) {
+  if (observer_) {
+    observer_->OnEvent(event);
+  }
+  return S_OK;
+}
+
+// IMFCaptureEngineOnSampleCallback
+HRESULT CaptureEngineListener::OnSample(IMFSample* sample) {
+  HRESULT hr = S_OK;
+
+  if (this->observer_ && sample) {
+    LONGLONG raw_time_stamp = 0;
+    // Receives the presentation time, in 100-nanosecond units.
+    sample->GetSampleTime(&raw_time_stamp);
+
+    // Report time in microseconds.
+    this->observer_->UpdateCaptureTime(
+        static_cast<uint64_t>(raw_time_stamp / 10));
+
+    if (!this->observer_->IsReadyForSample()) {
+      // No texture target available or not previewing, just return status.
+      return hr;
+    }
+
+    ComPtr<IMFMediaBuffer> buffer;
+    hr = sample->ConvertToContiguousBuffer(&buffer);
+
+    // Draw the frame.
+    if (SUCCEEDED(hr) && buffer) {
+      DWORD max_length = 0;
+      DWORD current_length = 0;
+      uint8_t* data;
+      if (SUCCEEDED(buffer->Lock(&data, &max_length, &current_length))) {
+        this->observer_->UpdateBuffer(data, current_length);
+      }
+      hr = buffer->Unlock();
+    }
+  }
+  return hr;
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/capture_engine_listener.h b/packages/camera/camera_windows/windows/capture_engine_listener.h
new file mode 100644
index 0000000..081e3ea
--- /dev/null
+++ b/packages/camera/camera_windows/windows/capture_engine_listener.h
@@ -0,0 +1,69 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_ENGINE_LISTENER_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_ENGINE_LISTENER_H_
+
+#include <mfcaptureengine.h>
+
+#include <cassert>
+#include <functional>
+
+namespace camera_windows {
+
+// Interface for classes that receive callbacks for events and samples from
+// a |CaptureEngineListener|.
+class CaptureEngineObserver {
+ public:
+  virtual ~CaptureEngineObserver() = default;
+
+  // Returns true if sample can be processed.
+  virtual bool IsReadyForSample() const = 0;
+
+  // Handles Capture Engine media events.
+  virtual void OnEvent(IMFMediaEvent* event) = 0;
+
+  // Updates texture buffer
+  virtual bool UpdateBuffer(uint8_t* data, uint32_t new_length) = 0;
+
+  // Handles capture timestamps updates.
+  // Used to stop timed recordings when recorded time is exceeded.
+  virtual void UpdateCaptureTime(uint64_t capture_time) = 0;
+};
+
+// Listener for Windows Media Foundation capture engine events and samples.
+//
+// Events are redirected to observers for processing. Samples are
+// preprocessed and sent to the associated observer if it is ready to
+// process samples.
+class CaptureEngineListener : public IMFCaptureEngineOnSampleCallback,
+                              public IMFCaptureEngineOnEventCallback {
+ public:
+  CaptureEngineListener(CaptureEngineObserver* observer) : observer_(observer) {
+    assert(observer);
+  }
+
+  ~CaptureEngineListener() {}
+
+  // Disallow copy and move.
+  CaptureEngineListener(const CaptureEngineListener&) = delete;
+  CaptureEngineListener& operator=(const CaptureEngineListener&) = delete;
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef();
+  STDMETHODIMP_(ULONG) Release();
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv);
+
+  // IMFCaptureEngineOnEventCallback
+  STDMETHODIMP OnEvent(IMFMediaEvent* pEvent);
+
+  // IMFCaptureEngineOnSampleCallback
+  STDMETHODIMP_(HRESULT) OnSample(IMFSample* pSample);
+
+ private:
+  CaptureEngineObserver* observer_;
+  volatile ULONG ref_ = 0;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_CAPTURE_ENGINE_LISTENER_H_
diff --git a/packages/camera/camera_windows/windows/com_heap_ptr.h b/packages/camera/camera_windows/windows/com_heap_ptr.h
new file mode 100644
index 0000000..a314ed3
--- /dev/null
+++ b/packages/camera/camera_windows/windows/com_heap_ptr.h
@@ -0,0 +1,66 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_COMHEAPPTR_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_COMHEAPPTR_H_
+
+#include <windows.h>
+
+#include <cassert>
+
+namespace camera_windows {
+// Wrapper for a COM heap allocation that provides automatic memory release.
+// The destructor uses CoTaskMemFree to release the allocation.
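+//
+// A usage sketch (the Media Foundation call is only an illustration):
+//
+//   ComHeapPtr<wchar_t> friendly_name;
+//   UINT32 name_size = 0;
+//   device_activate->GetAllocatedString(MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME,
+//                                       &friendly_name, &name_size);
+//   // The string is released with CoTaskMemFree when friendly_name goes
+//   // out of scope.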
+template <typename T>
+class ComHeapPtr {
+ public:
+  ComHeapPtr() : p_obj_(nullptr) {}
+  ComHeapPtr(T* p_obj) : p_obj_(p_obj) {}
+
+  // Frees memory on destruction.
+  ~ComHeapPtr() { Free(); }
+
+  // Prevent copying / ownership transfer as not currently needed.
+  ComHeapPtr(ComHeapPtr const&) = delete;
+  ComHeapPtr& operator=(ComHeapPtr const&) = delete;
+
+  // Returns the pointer to the memory.
+  operator T*() { return p_obj_; }
+
+  // Returns the pointer to the memory.
+  T* operator->() {
+    assert(p_obj_ != nullptr);
+    return p_obj_;
+  }
+
+  // Returns the pointer to the memory.
+  const T* operator->() const {
+    assert(p_obj_ != nullptr);
+    return p_obj_;
+  }
+
+  // Returns the address of the wrapped pointer for use as an out parameter.
+  T** operator&() {
+    // Wrapped object must be nullptr to avoid memory leaks.
+    // An existing object can be released with Free().
+    assert(p_obj_ == nullptr);
+    return &p_obj_;
+  }
+
+  // Frees the memory pointed to, and sets the pointer to nullptr.
+  void Free() {
+    if (p_obj_) {
+      CoTaskMemFree(p_obj_);
+    }
+    p_obj_ = nullptr;
+  }
+
+ private:
+  // Pointer to memory.
+  T* p_obj_;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_COMHEAPPTR_H_
diff --git a/packages/camera/camera_windows/windows/include/camera_windows/camera_windows.h b/packages/camera/camera_windows/windows/include/camera_windows/camera_windows.h
new file mode 100644
index 0000000..b1e28b8
--- /dev/null
+++ b/packages/camera/camera_windows/windows/include/camera_windows/camera_windows.h
@@ -0,0 +1,27 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_INCLUDE_CAMERA_WINDOWS_CAMERA_WINDOWS_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_INCLUDE_CAMERA_WINDOWS_CAMERA_WINDOWS_H_
+
+#include <flutter_plugin_registrar.h>
+
+#ifdef FLUTTER_PLUGIN_IMPL
+#define FLUTTER_PLUGIN_EXPORT __declspec(dllexport)
+#else
+#define FLUTTER_PLUGIN_EXPORT __declspec(dllimport)
+#endif
+
+#if defined(__cplusplus)
+extern "C" {
+#endif
+
+FLUTTER_PLUGIN_EXPORT void CameraWindowsRegisterWithRegistrar(
+    FlutterDesktopPluginRegistrarRef registrar);
+
+#if defined(__cplusplus)
+}  // extern "C"
+#endif
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_INCLUDE_CAMERA_WINDOWS_CAMERA_WINDOWS_H_
diff --git a/packages/camera/camera_windows/windows/photo_handler.cpp b/packages/camera/camera_windows/windows/photo_handler.cpp
new file mode 100644
index 0000000..10df230
--- /dev/null
+++ b/packages/camera/camera_windows/windows/photo_handler.cpp
@@ -0,0 +1,141 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "photo_handler.h"
+
+#include <mfapi.h>
+#include <mfcaptureengine.h>
+#include <wincodec.h>
+
+#include <cassert>
+
+#include "capture_engine_listener.h"
+#include "string_utils.h"
+
+namespace camera_windows {
+
+using Microsoft::WRL::ComPtr;
+
+// Builds a media type for photo capture using the given image format.
+HRESULT BuildMediaTypeForPhotoCapture(IMFMediaType* src_media_type,
+                                      IMFMediaType** photo_media_type,
+                                      GUID image_format) {
+  assert(src_media_type);
+  ComPtr<IMFMediaType> new_media_type;
+
+  HRESULT hr = MFCreateMediaType(&new_media_type);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Clones everything from original media type.
+  hr = src_media_type->CopyAllItems(new_media_type.Get());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = new_media_type->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Image);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = new_media_type->SetGUID(MF_MT_SUBTYPE, image_format);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  new_media_type.CopyTo(photo_media_type);
+  return hr;
+}
+
+HRESULT PhotoHandler::InitPhotoSink(IMFCaptureEngine* capture_engine,
+                                    IMFMediaType* base_media_type) {
+  assert(capture_engine);
+  assert(base_media_type);
+
+  HRESULT hr = S_OK;
+
+  if (photo_sink_) {
+    // If photo sink already exists, only update output filename.
+    hr = photo_sink_->SetOutputFileName(Utf16FromUtf8(file_path_).c_str());
+
+    if (FAILED(hr)) {
+      photo_sink_ = nullptr;
+    }
+
+    return hr;
+  }
+
+  ComPtr<IMFMediaType> photo_media_type;
+  ComPtr<IMFCaptureSink> capture_sink;
+
+  // Get sink with photo type.
+  hr =
+      capture_engine->GetSink(MF_CAPTURE_ENGINE_SINK_TYPE_PHOTO, &capture_sink);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = capture_sink.As(&photo_sink_);
+  if (FAILED(hr)) {
+    photo_sink_ = nullptr;
+    return hr;
+  }
+
+  hr = photo_sink_->RemoveAllStreams();
+  if (FAILED(hr)) {
+    photo_sink_ = nullptr;
+    return hr;
+  }
+
+  hr = BuildMediaTypeForPhotoCapture(base_media_type,
+                                     photo_media_type.GetAddressOf(),
+                                     GUID_ContainerFormatJpeg);
+
+  if (FAILED(hr)) {
+    photo_sink_ = nullptr;
+    return hr;
+  }
+
+  DWORD photo_sink_stream_index;
+  hr = photo_sink_->AddStream(
+      (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_PHOTO,
+      photo_media_type.Get(), nullptr, &photo_sink_stream_index);
+  if (FAILED(hr)) {
+    photo_sink_ = nullptr;
+    return hr;
+  }
+
+  hr = photo_sink_->SetOutputFileName(Utf16FromUtf8(file_path_).c_str());
+  if (FAILED(hr)) {
+    photo_sink_ = nullptr;
+    return hr;
+  }
+
+  return hr;
+}
+
+bool PhotoHandler::TakePhoto(const std::string& file_path,
+                             IMFCaptureEngine* capture_engine,
+                             IMFMediaType* base_media_type) {
+  assert(!file_path.empty());
+  assert(capture_engine);
+  assert(base_media_type);
+
+  file_path_ = file_path;
+
+  if (FAILED(InitPhotoSink(capture_engine, base_media_type))) {
+    return false;
+  }
+
+  photo_state_ = PhotoState::kTakingPhoto;
+  return SUCCEEDED(capture_engine->TakePhoto());
+}
+
+void PhotoHandler::OnPhotoTaken() {
+  assert(photo_state_ == PhotoState::kTakingPhoto);
+  photo_state_ = PhotoState::kIdle;
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/photo_handler.h b/packages/camera/camera_windows/windows/photo_handler.h
new file mode 100644
index 0000000..ef0d98b
--- /dev/null
+++ b/packages/camera/camera_windows/windows/photo_handler.h
@@ -0,0 +1,80 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_PHOTO_HANDLER_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_PHOTO_HANDLER_H_
+
+#include <mfapi.h>
+#include <mfcaptureengine.h>
+#include <wrl/client.h>
+
+#include <memory>
+#include <string>
+
+#include "capture_engine_listener.h"
+
+namespace camera_windows {
+using Microsoft::WRL::ComPtr;
+
+// Various states that the photo handler can be in.
+//
+// When created, the handler is in the |kNotStarted| state and transitions
+// in sequential order through the states.
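+//
+// Rough transition sketch: kNotStarted -> kTakingPhoto -> kIdle, and back
+// to kTakingPhoto for subsequent captures.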
+enum class PhotoState {
+  kNotStarted,
+  kIdle,
+  kTakingPhoto,
+};
+
+// Handles photo sink initialization and tracks photo capture states.
+class PhotoHandler {
+ public:
+  PhotoHandler() {}
+  virtual ~PhotoHandler() = default;
+
+  // Prevent copying.
+  PhotoHandler(PhotoHandler const&) = delete;
+  PhotoHandler& operator=(PhotoHandler const&) = delete;
+
+  // Initializes photo sink if not initialized and requests the capture engine
+  // to take photo.
+  //
+  // Sets photo state to: kTakingPhoto.
+  // Returns false if photo cannot be taken.
+  //
+  // capture_engine:  A pointer to capture engine instance.
+  //                  Called to take the photo.
+  // base_media_type: A pointer to base media type used as a base
+  //                  for the actual photo capture media type.
+  // file_path:       A string that holds the file path for photo capture.
+  bool TakePhoto(const std::string& file_path, IMFCaptureEngine* capture_engine,
+                 IMFMediaType* base_media_type);
+
+  // Sets the photo handler state to: kIdle.
+  void OnPhotoTaken();
+
+  // Returns true if photo state is kIdle.
+  bool IsInitialized() const { return photo_state_ == PhotoState::kIdle; }
+
+  // Returns true if photo state is kTakingPhoto.
+  bool IsTakingPhoto() const {
+    return photo_state_ == PhotoState::kTakingPhoto;
+  }
+
+  // Returns the filesystem path of the captured photo.
+  std::string GetPhotoPath() const { return file_path_; }
+
+ private:
+  // Initializes the photo sink for photo capture.
+  HRESULT InitPhotoSink(IMFCaptureEngine* capture_engine,
+                        IMFMediaType* base_media_type);
+
+  std::string file_path_;
+  PhotoState photo_state_ = PhotoState::kNotStarted;
+  ComPtr<IMFCapturePhotoSink> photo_sink_;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_PHOTO_HANDLER_H_
diff --git a/packages/camera/camera_windows/windows/preview_handler.cpp b/packages/camera/camera_windows/windows/preview_handler.cpp
new file mode 100644
index 0000000..d7fb272
--- /dev/null
+++ b/packages/camera/camera_windows/windows/preview_handler.cpp
@@ -0,0 +1,164 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "preview_handler.h"
+
+#include <mfapi.h>
+#include <mfcaptureengine.h>
+
+#include <cassert>
+
+#include "capture_engine_listener.h"
+#include "string_utils.h"
+
+namespace camera_windows {
+
+using Microsoft::WRL::ComPtr;
+
+// Initializes media type for video preview.
+HRESULT BuildMediaTypeForVideoPreview(IMFMediaType* src_media_type,
+                                      IMFMediaType** preview_media_type) {
+  assert(src_media_type);
+  ComPtr<IMFMediaType> new_media_type;
+
+  HRESULT hr = MFCreateMediaType(&new_media_type);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Clones everything from original media type.
+  hr = src_media_type->CopyAllItems(new_media_type.Get());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Changes subtype to MFVideoFormat_RGB32.
+  hr = new_media_type->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = new_media_type->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  new_media_type.CopyTo(preview_media_type);
+
+  return hr;
+}
+
+HRESULT PreviewHandler::InitPreviewSink(
+    IMFCaptureEngine* capture_engine, IMFMediaType* base_media_type,
+    CaptureEngineListener* sample_callback) {
+  assert(capture_engine);
+  assert(base_media_type);
+  assert(sample_callback);
+
+  HRESULT hr = S_OK;
+
+  if (preview_sink_) {
+    // Preview sink already initialized.
+    return hr;
+  }
+
+  ComPtr<IMFMediaType> preview_media_type;
+  ComPtr<IMFCaptureSink> capture_sink;
+
+  // Get sink with preview type.
+  hr = capture_engine->GetSink(MF_CAPTURE_ENGINE_SINK_TYPE_PREVIEW,
+                               &capture_sink);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = capture_sink.As(&preview_sink_);
+  if (FAILED(hr)) {
+    preview_sink_ = nullptr;
+    return hr;
+  }
+
+  hr = preview_sink_->RemoveAllStreams();
+  if (FAILED(hr)) {
+    preview_sink_ = nullptr;
+    return hr;
+  }
+
+  hr = BuildMediaTypeForVideoPreview(base_media_type,
+                                     preview_media_type.GetAddressOf());
+
+  if (FAILED(hr)) {
+    preview_sink_ = nullptr;
+    return hr;
+  }
+
+  DWORD preview_sink_stream_index;
+  hr = preview_sink_->AddStream(
+      (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_PREVIEW,
+      preview_media_type.Get(), nullptr, &preview_sink_stream_index);
+
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = preview_sink_->SetSampleCallback(preview_sink_stream_index,
+                                        sample_callback);
+
+  if (FAILED(hr)) {
+    preview_sink_ = nullptr;
+    return hr;
+  }
+
+  return hr;
+}
+
+bool PreviewHandler::StartPreview(IMFCaptureEngine* capture_engine,
+                                  IMFMediaType* base_media_type,
+                                  CaptureEngineListener* sample_callback) {
+  assert(capture_engine);
+  assert(base_media_type);
+
+  if (FAILED(
+          InitPreviewSink(capture_engine, base_media_type, sample_callback))) {
+    return false;
+  }
+
+  preview_state_ = PreviewState::kStarting;
+  return SUCCEEDED(capture_engine->StartPreview());
+}
+
+bool PreviewHandler::StopPreview(IMFCaptureEngine* capture_engine) {
+  if (preview_state_ == PreviewState::kStarting ||
+      preview_state_ == PreviewState::kRunning ||
+      preview_state_ == PreviewState::kPaused) {
+    preview_state_ = PreviewState::kStopping;
+    return SUCCEEDED(capture_engine->StopPreview());
+  }
+  return false;
+}
+
+bool PreviewHandler::PausePreview() {
+  if (preview_state_ != PreviewState::kRunning) {
+    return false;
+  }
+  preview_state_ = PreviewState::kPaused;
+  return true;
+}
+
+bool PreviewHandler::ResumePreview() {
+  if (preview_state_ != PreviewState::kPaused) {
+    return false;
+  }
+  preview_state_ = PreviewState::kRunning;
+  return true;
+}
+
+void PreviewHandler::OnPreviewStarted() {
+  assert(preview_state_ == PreviewState::kStarting);
+  if (preview_state_ == PreviewState::kStarting) {
+    preview_state_ = PreviewState::kRunning;
+  }
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/preview_handler.h b/packages/camera/camera_windows/windows/preview_handler.h
new file mode 100644
index 0000000..97b85fc
--- /dev/null
+++ b/packages/camera/camera_windows/windows/preview_handler.h
@@ -0,0 +1,103 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_PREVIEW_HANDLER_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_PREVIEW_HANDLER_H_
+
+#include <mfapi.h>
+#include <mfcaptureengine.h>
+#include <wrl/client.h>
+
+#include <memory>
+#include <string>
+
+#include "capture_engine_listener.h"
+
+namespace camera_windows {
+using Microsoft::WRL::ComPtr;
+
+// States the preview handler can be in.
+//
+// When created, the handler starts in |kNotStarted| state and mostly
+// transitions in sequential order of the states. When the preview is running,
+// it can be set to the |kPaused| state and later resumed to |kRunning| state.
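+//
+// Rough transition sketch: kNotStarted -> kStarting -> kRunning <-> kPaused,
+// and kStarting/kRunning/kPaused -> kStopping when the preview is stopped.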
+enum class PreviewState {
+  kNotStarted,
+  kStarting,
+  kRunning,
+  kPaused,
+  kStopping
+};
+
+// Handler for a camera's video preview.
+//
+// Handles preview sink initialization and manages the state of the video
+// preview.
+class PreviewHandler {
+ public:
+  PreviewHandler() {}
+  virtual ~PreviewHandler() = default;
+
+  // Prevent copying.
+  PreviewHandler(PreviewHandler const&) = delete;
+  PreviewHandler& operator=(PreviewHandler const&) = delete;
+
+  // Initializes the preview sink and requests the capture engine to start
+  // previewing. Sets the preview state to: starting.
+  // Returns false if the preview cannot be started.
+  //
+  // capture_engine:  A pointer to capture engine instance. Used to start
+  //                  the actual preview.
+  // base_media_type: A pointer to base media type used as a base
+  //                  for the actual video capture media type.
+  // sample_callback: A pointer to capture engine listener.
+  //                  This is set as sample callback for preview sink.
+  bool StartPreview(IMFCaptureEngine* capture_engine,
+                    IMFMediaType* base_media_type,
+                    CaptureEngineListener* sample_callback);
+
+  // Stops the existing preview.
+  // Returns false if the preview cannot be stopped.
+  //
+  // capture_engine:  A pointer to capture engine instance. Used to stop
+  //                  the ongoing preview.
+  bool StopPreview(IMFCaptureEngine* capture_engine);
+
+  // Sets the preview state to: paused.
+  bool PausePreview();
+
+  // Sets the preview state to: running.
+  bool ResumePreview();
+
+  // Marks the preview as started by setting the preview state to: running.
+  void OnPreviewStarted();
+
+  // Returns true if preview state is running or paused.
+  bool IsInitialized() const {
+    return preview_state_ == PreviewState::kRunning ||
+           preview_state_ == PreviewState::kPaused;
+  }
+
+  // Returns true if preview state is running.
+  bool IsRunning() const { return preview_state_ == PreviewState::kRunning; }
+
+  // Returns true if preview state is paused.
+  bool IsPaused() const { return preview_state_ == PreviewState::kPaused; }
+
+  // Returns true if preview state is starting.
+  bool IsStarting() const { return preview_state_ == PreviewState::kStarting; }
+
+ private:
+  // Initializes the preview sink for the video preview.
+  HRESULT InitPreviewSink(IMFCaptureEngine* capture_engine,
+                          IMFMediaType* base_media_type,
+                          CaptureEngineListener* sample_callback);
+
+  PreviewState preview_state_ = PreviewState::kNotStarted;
+  ComPtr<IMFCapturePreviewSink> preview_sink_;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_PREVIEW_HANDLER_H_
diff --git a/packages/camera/camera_windows/windows/record_handler.cpp b/packages/camera/camera_windows/windows/record_handler.cpp
new file mode 100644
index 0000000..1cb258e
--- /dev/null
+++ b/packages/camera/camera_windows/windows/record_handler.cpp
@@ -0,0 +1,260 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "record_handler.h"
+
+#include <mfapi.h>
+#include <mfcaptureengine.h>
+
+#include <cassert>
+
+#include "string_utils.h"
+
+namespace camera_windows {
+
+using Microsoft::WRL::ComPtr;
+
+// Initializes media type for video capture.
+HRESULT BuildMediaTypeForVideoCapture(IMFMediaType* src_media_type,
+                                      IMFMediaType** video_record_media_type,
+                                      GUID capture_format) {
+  assert(src_media_type);
+  ComPtr<IMFMediaType> new_media_type;
+
+  HRESULT hr = MFCreateMediaType(&new_media_type);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Clones everything from original media type.
+  hr = src_media_type->CopyAllItems(new_media_type.Get());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = new_media_type->SetGUID(MF_MT_SUBTYPE, capture_format);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  new_media_type.CopyTo(video_record_media_type);
+  return S_OK;
+}
+
+// Queries interface object from collection.
+template <class Q>
+HRESULT GetCollectionObject(IMFCollection* pCollection, DWORD index,
+                            Q** ppObj) {
+  ComPtr<IUnknown> pUnk;
+  HRESULT hr = pCollection->GetElement(index, pUnk.GetAddressOf());
+  if (FAILED(hr)) {
+    return hr;
+  }
+  return pUnk->QueryInterface(IID_PPV_ARGS(ppObj));
+}
+
+// Initializes media type for audio capture.
+HRESULT BuildMediaTypeForAudioCapture(IMFMediaType** audio_record_media_type) {
+  ComPtr<IMFAttributes> audio_output_attributes;
+  ComPtr<IMFMediaType> src_media_type;
+  ComPtr<IMFMediaType> new_media_type;
+  ComPtr<IMFCollection> available_output_types;
+  DWORD mt_count = 0;
+
+  HRESULT hr = MFCreateAttributes(&audio_output_attributes, 1);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Enumerates only low latency audio outputs.
+  hr = audio_output_attributes->SetUINT32(MF_LOW_LATENCY, TRUE);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  DWORD mft_flags = (MFT_ENUM_FLAG_ALL & (~MFT_ENUM_FLAG_FIELDOFUSE)) |
+                    MFT_ENUM_FLAG_SORTANDFILTER;
+
+  hr = MFTranscodeGetAudioOutputAvailableTypes(
+      MFAudioFormat_AAC, mft_flags, audio_output_attributes.Get(),
+      available_output_types.GetAddressOf());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = GetCollectionObject(available_output_types.Get(), 0,
+                           src_media_type.GetAddressOf());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = available_output_types->GetElementCount(&mt_count);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  if (mt_count == 0) {
+    // No sources found, mark process as failure.
+    return E_FAIL;
+  }
+
+  // Create new media type to copy original media type to.
+  hr = MFCreateMediaType(&new_media_type);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = src_media_type->CopyAllItems(new_media_type.Get());
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  new_media_type.CopyTo(audio_record_media_type);
+  return hr;
+}
+
+HRESULT RecordHandler::InitRecordSink(IMFCaptureEngine* capture_engine,
+                                      IMFMediaType* base_media_type) {
+  assert(!file_path_.empty());
+  assert(capture_engine);
+  assert(base_media_type);
+
+  HRESULT hr = S_OK;
+  if (record_sink_) {
+    // If record sink already exists, only update output filename.
+    hr = record_sink_->SetOutputFileName(Utf16FromUtf8(file_path_).c_str());
+
+    if (FAILED(hr)) {
+      record_sink_ = nullptr;
+    }
+    return hr;
+  }
+
+  ComPtr<IMFMediaType> video_record_media_type;
+  ComPtr<IMFCaptureSink> capture_sink;
+
+  // Gets sink from capture engine with record type.
+
+  hr = capture_engine->GetSink(MF_CAPTURE_ENGINE_SINK_TYPE_RECORD,
+                               &capture_sink);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = capture_sink.As(&record_sink_);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  // Removes existing streams if available.
+  hr = record_sink_->RemoveAllStreams();
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  hr = BuildMediaTypeForVideoCapture(base_media_type,
+                                     video_record_media_type.GetAddressOf(),
+                                     MFVideoFormat_H264);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  DWORD video_record_sink_stream_index;
+  hr = record_sink_->AddStream(
+      (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_RECORD,
+      video_record_media_type.Get(), nullptr, &video_record_sink_stream_index);
+  if (FAILED(hr)) {
+    return hr;
+  }
+
+  if (record_audio_) {
+    ComPtr<IMFMediaType> audio_record_media_type;
+    HRESULT audio_capture_hr =
+        BuildMediaTypeForAudioCapture(audio_record_media_type.GetAddressOf());
+
+    if (SUCCEEDED(audio_capture_hr)) {
+      DWORD audio_record_sink_stream_index;
+      hr = record_sink_->AddStream(
+          (DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_AUDIO,
+          audio_record_media_type.Get(), nullptr,
+          &audio_record_sink_stream_index);
+    }
+
+    if (FAILED(hr)) {
+      return hr;
+    }
+  }
+
+  hr = record_sink_->SetOutputFileName(Utf16FromUtf8(file_path_).c_str());
+
+  return hr;
+}
+
+bool RecordHandler::StartRecord(const std::string& file_path,
+                                int64_t max_duration,
+                                IMFCaptureEngine* capture_engine,
+                                IMFMediaType* base_media_type) {
+  assert(!file_path.empty());
+  assert(capture_engine);
+  assert(base_media_type);
+
+  type_ = max_duration < 0 ? RecordingType::kContinuous : RecordingType::kTimed;
+  max_video_duration_ms_ = max_duration;
+  file_path_ = file_path;
+  recording_start_timestamp_us_ = -1;
+  recording_duration_us_ = 0;
+
+  if (FAILED(InitRecordSink(capture_engine, base_media_type))) {
+    return false;
+  }
+
+  recording_state_ = RecordState::kStarting;
+  capture_engine->StartRecord();
+
+  return true;
+}
+
+bool RecordHandler::StopRecord(IMFCaptureEngine* capture_engine) {
+  if (recording_state_ == RecordState::kRunning) {
+    recording_state_ = RecordState::kStopping;
+    HRESULT hr = capture_engine->StopRecord(true, false);
+    return SUCCEEDED(hr);
+  }
+  return false;
+}
+
+void RecordHandler::OnRecordStarted() {
+  if (recording_state_ == RecordState::kStarting) {
+    recording_state_ = RecordState::kRunning;
+  }
+}
+
+void RecordHandler::OnRecordStopped() {
+  if (recording_state_ == RecordState::kStopping) {
+    file_path_ = "";
+    recording_start_timestamp_us_ = -1;
+    recording_duration_us_ = 0;
+    max_video_duration_ms_ = -1;
+    recording_state_ = RecordState::kNotStarted;
+  }
+}
+
+void RecordHandler::UpdateRecordingTime(uint64_t timestamp) {
+  if (recording_start_timestamp_us_ < 0) {
+    recording_start_timestamp_us_ = timestamp;
+  }
+
+  recording_duration_us_ = (timestamp - recording_start_timestamp_us_);
+}
+
+bool RecordHandler::ShouldStopTimedRecording() const {
+  return type_ == RecordingType::kTimed &&
+         recording_state_ == RecordState::kRunning &&
+         max_video_duration_ms_ > 0 &&
+         recording_duration_us_ >=
+             (static_cast<uint64_t>(max_video_duration_ms_) * 1000);
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/record_handler.h b/packages/camera/camera_windows/windows/record_handler.h
new file mode 100644
index 0000000..0daa7f6
--- /dev/null
+++ b/packages/camera/camera_windows/windows/record_handler.h
@@ -0,0 +1,118 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_RECORD_HANDLER_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_RECORD_HANDLER_H_
+
+#include <mfapi.h>
+#include <mfcaptureengine.h>
+#include <wrl/client.h>
+
+#include <memory>
+#include <string>
+
+namespace camera_windows {
+using Microsoft::WRL::ComPtr;
+
+enum class RecordingType {
+  // Recording continues until it is stopped with a separate stop command.
+  kContinuous,
+  // Recording stops automatically after the requested record time has
+  // passed.
+  kTimed
+};
+
+// States that the record handler can be in.
+//
+// When created, the handler starts in the |kNotStarted| state and transitions
+// through the states in sequential order.
+enum class RecordState { kNotStarted, kStarting, kRunning, kStopping };
+
+// Handler for video recording via the camera.
+//
+// Handles record sink initialization and manages the state of video recording.
+class RecordHandler {
+ public:
+  explicit RecordHandler(bool record_audio) : record_audio_(record_audio) {}
+  virtual ~RecordHandler() = default;
+
+  // Prevent copying.
+  RecordHandler(RecordHandler const&) = delete;
+  RecordHandler& operator=(RecordHandler const&) = delete;
+
+  // Initializes record sink and requests capture engine to start recording.
+  //
+  // Sets record state to: starting.
+  // Returns false if recording cannot be started.
+  //
+  // file_path:       A string that holds the file path for the video capture.
+  // max_duration:    An int64 value of the maximum recording duration in
+  //                  milliseconds. A value of -1 means the recording is
+  //                  treated as a continuous recording.
+  // capture_engine:  A pointer to capture engine instance. Used to start
+  //                  the actual recording.
+  // base_media_type: A pointer to base media type used as a base
+  //                  for the actual video capture media type.
+  bool StartRecord(const std::string& file_path, int64_t max_duration,
+                   IMFCaptureEngine* capture_engine,
+                   IMFMediaType* base_media_type);
+
+  // Stops existing recording.
+  // Returns false if recording cannot be stopped.
+  //
+  // capture_engine:  A pointer to capture engine instance. Used to stop
+  //                  the ongoing recording.
+  bool StopRecord(IMFCaptureEngine* capture_engine);
+
+  // Sets the record handler recording state to: running.
+  void OnRecordStarted();
+
+  // Resets the record handler state and
+  // sets recording state to: not started.
+  void OnRecordStopped();
+
+  // Returns true if recording type is continuous recording.
+  bool IsContinuousRecording() const {
+    return type_ == RecordingType::kContinuous;
+  }
+
+  // Returns true if recording type is timed recording.
+  bool IsTimedRecording() const { return type_ == RecordingType::kTimed; }
+
+  // Returns true if new recording can be started.
+  bool CanStart() const { return recording_state_ == RecordState::kNotStarted; }
+
+  // Returns true if recording can be stopped.
+  bool CanStop() const { return recording_state_ == RecordState::kRunning; }
+
+  // Returns the filesystem path of the video recording.
+  std::string GetRecordPath() const { return file_path_; }
+
+  // Returns the duration of the video recording in microseconds.
+  uint64_t GetRecordedDuration() const { return recording_duration_us_; }
+
+  // Calculates new recording time from capture timestamp.
+  void UpdateRecordingTime(uint64_t timestamp);
+
+  // Returns true if recording time has exceeded the maximum duration for timed
+  // recordings.
+  bool ShouldStopTimedRecording() const;
+
+ private:
+  // Initializes record sink for video file capture.
+  HRESULT InitRecordSink(IMFCaptureEngine* capture_engine,
+                         IMFMediaType* base_media_type);
+
+  bool record_audio_ = false;
+  int64_t max_video_duration_ms_ = -1;
+  int64_t recording_start_timestamp_us_ = -1;
+  uint64_t recording_duration_us_ = 0;
+  std::string file_path_;
+  RecordState recording_state_ = RecordState::kNotStarted;
+  RecordingType type_ = RecordingType::kContinuous;
+  ComPtr<IMFCaptureRecordSink> record_sink_;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_RECORD_HANDLER_H_
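For reviewers, here is a rough sketch of how the owner of a `RecordHandler` (presumably `CaptureControllerImpl` in this PR) is expected to drive the state machine documented above. It is illustrative only and not part of the change; `engine`, `base_media_type`, and the sample timestamp are placeholders for values the capture controller already holds.

```cpp
#include <memory>
#include <string>

#include "record_handler.h"

namespace camera_windows {

// Illustrative only: records roughly five seconds of video with audio.
void RecordFiveSecondsSketch(IMFCaptureEngine* engine,
                             IMFMediaType* base_media_type) {
  auto handler = std::make_unique<RecordHandler>(/*record_audio=*/true);

  // kNotStarted -> kStarting: initializes the record sink and asks the
  // capture engine to start recording. A positive max_duration (in
  // milliseconds) makes this a timed recording; -1 would be continuous.
  if (!handler->CanStart() ||
      !handler->StartRecord("video.mp4", 5000, engine, base_media_type)) {
    return;
  }

  // kStarting -> kRunning: the owner calls this from the capture engine's
  // record-started event.
  handler->OnRecordStarted();

  // While recording, the owner feeds each sample timestamp (in microseconds)
  // and checks whether a timed recording has reached its limit.
  uint64_t sample_timestamp_us = 0;  // Placeholder; comes from the sample.
  handler->UpdateRecordingTime(sample_timestamp_us);
  if (handler->ShouldStopTimedRecording() && handler->CanStop()) {
    // kRunning -> kStopping; OnRecordStopped() later returns the handler to
    // kNotStarted when the engine reports the record-stopped event.
    handler->StopRecord(engine);
  }
}

}  // namespace camera_windows
```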
diff --git a/packages/camera/camera_windows/windows/string_utils.cpp b/packages/camera/camera_windows/windows/string_utils.cpp
new file mode 100644
index 0000000..2e60e1b
--- /dev/null
+++ b/packages/camera/camera_windows/windows/string_utils.cpp
@@ -0,0 +1,60 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "string_utils.h"
+
+#include <shobjidl.h>
+#include <windows.h>
+
+#include <string>
+
+namespace camera_windows {
+
+// Converts the given UTF-16 string to UTF-8.
+std::string Utf8FromUtf16(const std::wstring& utf16_string) {
+  if (utf16_string.empty()) {
+    return std::string();
+  }
+  int target_length = ::WideCharToMultiByte(
+      CP_UTF8, WC_ERR_INVALID_CHARS, utf16_string.data(),
+      static_cast<int>(utf16_string.length()), nullptr, 0, nullptr, nullptr);
+  if (target_length == 0) {
+    return std::string();
+  }
+  std::string utf8_string;
+  utf8_string.resize(target_length);
+  int converted_length = ::WideCharToMultiByte(
+      CP_UTF8, WC_ERR_INVALID_CHARS, utf16_string.data(),
+      static_cast<int>(utf16_string.length()), utf8_string.data(),
+      target_length, nullptr, nullptr);
+  if (converted_length == 0) {
+    return std::string();
+  }
+  return utf8_string;
+}
+
+// Converts the given UTF-8 string to UTF-16.
+std::wstring Utf16FromUtf8(const std::string& utf8_string) {
+  if (utf8_string.empty()) {
+    return std::wstring();
+  }
+  int target_length =
+      ::MultiByteToWideChar(CP_UTF8, MB_ERR_INVALID_CHARS, utf8_string.data(),
+                            static_cast<int>(utf8_string.length()), nullptr, 0);
+  if (target_length == 0) {
+    return std::wstring();
+  }
+  std::wstring utf16_string;
+  utf16_string.resize(target_length);
+  int converted_length =
+      ::MultiByteToWideChar(CP_UTF8, MB_ERR_INVALID_CHARS, utf8_string.data(),
+                            static_cast<int>(utf8_string.length()),
+                            utf16_string.data(), target_length);
+  if (converted_length == 0) {
+    return std::wstring();
+  }
+  return utf16_string;
+}
+
+}  // namespace camera_windows
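Both conversion helpers fail closed: they return an empty string when the input is empty or when Windows rejects invalid code units (`WC_ERR_INVALID_CHARS` / `MB_ERR_INVALID_CHARS`). A minimal round-trip sketch, illustrative only and not part of the diff:

```cpp
#include <cassert>
#include <string>

#include "string_utils.h"

int main() {
  // "Front Camera ä" encoded as UTF-8.
  std::string utf8 = "Front Camera \xC3\xA4";
  std::wstring utf16 = camera_windows::Utf16FromUtf8(utf8);
  assert(camera_windows::Utf8FromUtf16(utf16) == utf8);

  // A lone continuation byte is invalid UTF-8, so conversion yields "".
  assert(camera_windows::Utf16FromUtf8("\x80").empty());
  return 0;
}
```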
diff --git a/packages/camera/camera_windows/windows/string_utils.h b/packages/camera/camera_windows/windows/string_utils.h
new file mode 100644
index 0000000..562c46a
--- /dev/null
+++ b/packages/camera/camera_windows/windows/string_utils.h
@@ -0,0 +1,22 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_STRING_UTILS_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_STRING_UTILS_H_
+
+#include <shobjidl.h>
+
+#include <string>
+
+namespace camera_windows {
+
+// Converts the given UTF-16 string to UTF-8.
+std::string Utf8FromUtf16(const std::wstring& utf16_string);
+
+// Converts the given UTF-8 string to UTF-16.
+std::wstring Utf16FromUtf8(const std::string& utf8_string);
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_STRING_UTILS_H_
diff --git a/packages/camera/camera_windows/windows/test/camera_plugin_test.cpp b/packages/camera/camera_windows/windows/test/camera_plugin_test.cpp
new file mode 100644
index 0000000..309268a
--- /dev/null
+++ b/packages/camera/camera_windows/windows/test/camera_plugin_test.cpp
@@ -0,0 +1,1010 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "camera_plugin.h"
+
+#include <flutter/method_call.h>
+#include <flutter/method_result_functions.h>
+#include <flutter/standard_method_codec.h>
+#include <flutter/texture_registrar.h>
+#include <gmock/gmock.h>
+#include <gtest/gtest.h>
+#include <windows.h>
+
+#include <functional>
+#include <memory>
+#include <string>
+
+#include "mocks.h"
+
+namespace camera_windows {
+namespace test {
+
+using flutter::EncodableMap;
+using flutter::EncodableValue;
+using ::testing::_;
+using ::testing::DoAll;
+using ::testing::EndsWith;
+using ::testing::Eq;
+using ::testing::Pointee;
+using ::testing::Return;
+
+TEST(CameraPlugin, AvailableCamerasHandlerSuccessIfNoCameras) {
+  std::unique_ptr<MockTextureRegistrar> texture_registrar_ =
+      std::make_unique<MockTextureRegistrar>();
+  std::unique_ptr<MockBinaryMessenger> messenger_ =
+      std::make_unique<MockBinaryMessenger>();
+  std::unique_ptr<MockCameraFactory> camera_factory_ =
+      std::make_unique<MockCameraFactory>();
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  MockCameraPlugin plugin(texture_registrar_.get(), messenger_.get(),
+                          std::move(camera_factory_));
+
+  EXPECT_CALL(plugin, EnumerateVideoCaptureDeviceSources)
+      .Times(1)
+      .WillOnce([](IMFActivate*** devices, UINT32* count) {
+        *count = 0U;
+        *devices = static_cast<IMFActivate**>(
+            CoTaskMemAlloc(sizeof(IMFActivate*) * (*count)));
+        return true;
+      });
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(*result, SuccessInternal).Times(1);
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("availableCameras",
+                          std::make_unique<EncodableValue>()),
+      std::move(result));
+}
+
+TEST(CameraPlugin, AvailableCamerasHandlerErrorIfFailsToEnumerateDevices) {
+  std::unique_ptr<MockTextureRegistrar> texture_registrar_ =
+      std::make_unique<MockTextureRegistrar>();
+  std::unique_ptr<MockBinaryMessenger> messenger_ =
+      std::make_unique<MockBinaryMessenger>();
+  std::unique_ptr<MockCameraFactory> camera_factory_ =
+      std::make_unique<MockCameraFactory>();
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  MockCameraPlugin plugin(texture_registrar_.get(), messenger_.get(),
+                          std::move(camera_factory_));
+
+  EXPECT_CALL(plugin, EnumerateVideoCaptureDeviceSources)
+      .Times(1)
+      .WillOnce([](IMFActivate*** devices, UINT32* count) { return false; });
+
+  EXPECT_CALL(*result, ErrorInternal).Times(1);
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("availableCameras",
+                          std::make_unique<EncodableValue>()),
+      std::move(result));
+}
+
+TEST(CameraPlugin, CreateHandlerCallsInitCamera) {
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+  std::unique_ptr<MockTextureRegistrar> texture_registrar_ =
+      std::make_unique<MockTextureRegistrar>();
+  std::unique_ptr<MockBinaryMessenger> messenger_ =
+      std::make_unique<MockBinaryMessenger>();
+  std::unique_ptr<MockCameraFactory> camera_factory_ =
+      std::make_unique<MockCameraFactory>();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kCreateCamera)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera,
+              AddPendingResult(Eq(PendingResultType::kCreateCamera), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+  EXPECT_CALL(*camera, InitCamera)
+      .Times(1)
+      .WillOnce([cam = camera.get()](
+                    flutter::TextureRegistrar* texture_registrar,
+                    flutter::BinaryMessenger* messenger, bool record_audio,
+                    ResolutionPreset resolution_preset) {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success(EncodableValue(1));
+      });
+
+  // Move the mocked camera into the factory so that the plugin receives it
+  // from the CreateCamera function.
+  camera_factory_->pending_camera_ = std::move(camera);
+
+  EXPECT_CALL(*camera_factory_, CreateCamera(MOCK_DEVICE_ID));
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(*result, SuccessInternal(Pointee(EncodableValue(1))));
+
+  CameraPlugin plugin(texture_registrar_.get(), messenger_.get(),
+                      std::move(camera_factory_));
+  EncodableMap args = {
+      {EncodableValue("cameraName"), EncodableValue(MOCK_CAMERA_NAME)},
+      {EncodableValue("resolutionPreset"), EncodableValue(nullptr)},
+      {EncodableValue("enableAudio"), EncodableValue(true)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("create",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(result));
+}
+
+TEST(CameraPlugin, CreateHandlerErrorOnInvalidDeviceId) {
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+  std::unique_ptr<MockTextureRegistrar> texture_registrar_ =
+      std::make_unique<MockTextureRegistrar>();
+  std::unique_ptr<MockBinaryMessenger> messenger_ =
+      std::make_unique<MockBinaryMessenger>();
+  std::unique_ptr<MockCameraFactory> camera_factory_ =
+      std::make_unique<MockCameraFactory>();
+
+  CameraPlugin plugin(texture_registrar_.get(), messenger_.get(),
+                      std::move(camera_factory_));
+  EncodableMap args = {
+      {EncodableValue("cameraName"), EncodableValue(MOCK_INVALID_CAMERA_NAME)},
+      {EncodableValue("resolutionPreset"), EncodableValue(nullptr)},
+      {EncodableValue("enableAudio"), EncodableValue(true)},
+  };
+
+  EXPECT_CALL(*result, ErrorInternal).Times(1);
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("create",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(result));
+}
+
+TEST(CameraPlugin, CreateHandlerErrorOnExistingDeviceId) {
+  std::unique_ptr<MockMethodResult> first_create_result =
+      std::make_unique<MockMethodResult>();
+  std::unique_ptr<MockMethodResult> second_create_result =
+      std::make_unique<MockMethodResult>();
+  std::unique_ptr<MockTextureRegistrar> texture_registrar_ =
+      std::make_unique<MockTextureRegistrar>();
+  std::unique_ptr<MockBinaryMessenger> messenger_ =
+      std::make_unique<MockBinaryMessenger>();
+  std::unique_ptr<MockCameraFactory> camera_factory_ =
+      std::make_unique<MockCameraFactory>();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kCreateCamera)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera,
+              AddPendingResult(Eq(PendingResultType::kCreateCamera), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+  EXPECT_CALL(*camera, InitCamera)
+      .Times(1)
+      .WillOnce([cam = camera.get()](
+                    flutter::TextureRegistrar* texture_registrar,
+                    flutter::BinaryMessenger* messenger, bool record_audio,
+                    ResolutionPreset resolution_preset) {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success(EncodableValue(1));
+      });
+
+  EXPECT_CALL(*camera, HasDeviceId(Eq(MOCK_DEVICE_ID)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](std::string& device_id) {
+        return cam->device_id_ == device_id;
+      });
+
+  // Move the mocked camera into the factory so that the plugin receives it
+  // from the CreateCamera function.
+  camera_factory_->pending_camera_ = std::move(camera);
+
+  EXPECT_CALL(*camera_factory_, CreateCamera(MOCK_DEVICE_ID));
+
+  EXPECT_CALL(*first_create_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*first_create_result,
+              SuccessInternal(Pointee(EncodableValue(1))));
+
+  CameraPlugin plugin(texture_registrar_.get(), messenger_.get(),
+                      std::move(camera_factory_));
+  EncodableMap args = {
+      {EncodableValue("cameraName"), EncodableValue(MOCK_CAMERA_NAME)},
+      {EncodableValue("resolutionPreset"), EncodableValue(nullptr)},
+      {EncodableValue("enableAudio"), EncodableValue(true)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("create",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(first_create_result));
+
+  EXPECT_CALL(*second_create_result, ErrorInternal).Times(1);
+  EXPECT_CALL(*second_create_result, SuccessInternal).Times(0);
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("create",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(second_create_result));
+}
+
+TEST(CameraPlugin, InitializeHandlerCallStartPreview) {
+  int64_t mock_camera_id = 1234;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId(Eq(mock_camera_id)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kInitialize)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera, AddPendingResult(Eq(PendingResultType::kInitialize), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+
+  EXPECT_CALL(*camera, GetCaptureController)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->capture_controller_.get();
+      });
+
+  EXPECT_CALL(*capture_controller, StartPreview())
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success();
+      });
+
+  camera->camera_id_ = mock_camera_id;
+  camera->capture_controller_ = std::move(capture_controller);
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(1);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(mock_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("initialize",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, InitializeHandlerErrorOnInvalidCameraId) {
+  int64_t mock_camera_id = 1234;
+  int64_t missing_camera_id = 5678;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId)
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera, HasPendingResultByType).Times(0);
+  EXPECT_CALL(*camera, AddPendingResult).Times(0);
+  EXPECT_CALL(*camera, GetCaptureController).Times(0);
+  EXPECT_CALL(*capture_controller, StartPreview).Times(0);
+
+  camera->camera_id_ = mock_camera_id;
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(1);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(0);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(missing_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("initialize",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, TakePictureHandlerCallsTakePictureWithPath) {
+  int64_t mock_camera_id = 1234;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId(Eq(mock_camera_id)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kTakePicture)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera, AddPendingResult(Eq(PendingResultType::kTakePicture), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+
+  EXPECT_CALL(*camera, GetCaptureController)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->capture_controller_.get();
+      });
+
+  EXPECT_CALL(*capture_controller, TakePicture(EndsWith(".jpeg")))
+      .Times(1)
+      .WillOnce([cam = camera.get()](const std::string& file_path) {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success();
+      });
+
+  camera->camera_id_ = mock_camera_id;
+  camera->capture_controller_ = std::move(capture_controller);
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(1);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(mock_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("takePicture",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, TakePictureHandlerErrorOnInvalidCameraId) {
+  int64_t mock_camera_id = 1234;
+  int64_t missing_camera_id = 5678;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId)
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera, HasPendingResultByType).Times(0);
+  EXPECT_CALL(*camera, AddPendingResult).Times(0);
+  EXPECT_CALL(*camera, GetCaptureController).Times(0);
+  EXPECT_CALL(*capture_controller, TakePicture).Times(0);
+
+  camera->camera_id_ = mock_camera_id;
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(1);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(0);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(missing_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("takePicture",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, StartVideoRecordingHandlerCallsStartRecordWithPath) {
+  int64_t mock_camera_id = 1234;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId(Eq(mock_camera_id)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kStartRecord)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera, AddPendingResult(Eq(PendingResultType::kStartRecord), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+
+  EXPECT_CALL(*camera, GetCaptureController)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->capture_controller_.get();
+      });
+
+  EXPECT_CALL(*capture_controller, StartRecord(EndsWith(".mp4"), -1))
+      .Times(1)
+      .WillOnce([cam = camera.get()](const std::string& file_path,
+                                     int64_t max_video_duration_ms) {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success();
+      });
+
+  camera->camera_id_ = mock_camera_id;
+  camera->capture_controller_ = std::move(capture_controller);
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(1);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(mock_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("startVideoRecording",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin,
+     StartVideoRecordingHandlerCallsStartRecordWithPathAndCaptureDuration) {
+  int64_t mock_camera_id = 1234;
+  int32_t mock_video_duration = 100000;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId(Eq(mock_camera_id)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kStartRecord)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera, AddPendingResult(Eq(PendingResultType::kStartRecord), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+
+  EXPECT_CALL(*camera, GetCaptureController)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->capture_controller_.get();
+      });
+
+  EXPECT_CALL(*capture_controller,
+              StartRecord(EndsWith(".mp4"), Eq(mock_video_duration)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](const std::string& file_path,
+                                     int64_t max_video_duration_ms) {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success();
+      });
+
+  camera->camera_id_ = mock_camera_id;
+  camera->capture_controller_ = std::move(capture_controller);
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(1);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(mock_camera_id)},
+      {EncodableValue("maxVideoDuration"), EncodableValue(mock_video_duration)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("startVideoRecording",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, StartVideoRecordingHandlerErrorOnInvalidCameraId) {
+  int64_t mock_camera_id = 1234;
+  int64_t missing_camera_id = 5678;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId)
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera, HasPendingResultByType).Times(0);
+  EXPECT_CALL(*camera, AddPendingResult).Times(0);
+  EXPECT_CALL(*camera, GetCaptureController).Times(0);
+  EXPECT_CALL(*capture_controller, StartRecord(_, -1)).Times(0);
+
+  camera->camera_id_ = mock_camera_id;
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(1);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(0);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(missing_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("startVideoRecording",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, StopVideoRecordingHandlerCallsStopRecord) {
+  int64_t mock_camera_id = 1234;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId(Eq(mock_camera_id)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kStopRecord)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera, AddPendingResult(Eq(PendingResultType::kStopRecord), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+
+  EXPECT_CALL(*camera, GetCaptureController)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->capture_controller_.get();
+      });
+
+  EXPECT_CALL(*capture_controller, StopRecord)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success();
+      });
+
+  camera->camera_id_ = mock_camera_id;
+  camera->capture_controller_ = std::move(capture_controller);
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(1);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(mock_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("stopVideoRecording",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, StopVideoRecordingHandlerErrorOnInvalidCameraId) {
+  int64_t mock_camera_id = 1234;
+  int64_t missing_camera_id = 5678;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId)
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera, HasPendingResultByType).Times(0);
+  EXPECT_CALL(*camera, AddPendingResult).Times(0);
+  EXPECT_CALL(*camera, GetCaptureController).Times(0);
+  EXPECT_CALL(*capture_controller, StopRecord).Times(0);
+
+  camera->camera_id_ = mock_camera_id;
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(1);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(0);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(missing_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("stopVideoRecording",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, ResumePreviewHandlerCallsResumePreview) {
+  int64_t mock_camera_id = 1234;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId(Eq(mock_camera_id)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kResumePreview)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera,
+              AddPendingResult(Eq(PendingResultType::kResumePreview), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+
+  EXPECT_CALL(*camera, GetCaptureController)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->capture_controller_.get();
+      });
+
+  EXPECT_CALL(*capture_controller, ResumePreview)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success();
+      });
+
+  camera->camera_id_ = mock_camera_id;
+  camera->capture_controller_ = std::move(capture_controller);
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(1);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(mock_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("resumePreview",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, ResumePreviewHandlerErrorOnInvalidCameraId) {
+  int64_t mock_camera_id = 1234;
+  int64_t missing_camera_id = 5678;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId)
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera, HasPendingResultByType).Times(0);
+  EXPECT_CALL(*camera, AddPendingResult).Times(0);
+  EXPECT_CALL(*camera, GetCaptureController).Times(0);
+  EXPECT_CALL(*capture_controller, ResumePreview).Times(0);
+
+  camera->camera_id_ = mock_camera_id;
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(1);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(0);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(missing_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("resumePreview",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, PausePreviewHandlerCallsPausePreview) {
+  int64_t mock_camera_id = 1234;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId(Eq(mock_camera_id)))
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera,
+              HasPendingResultByType(Eq(PendingResultType::kPausePreview)))
+      .Times(1)
+      .WillOnce(Return(false));
+
+  EXPECT_CALL(*camera,
+              AddPendingResult(Eq(PendingResultType::kPausePreview), _))
+      .Times(1)
+      .WillOnce([cam = camera.get()](PendingResultType type,
+                                     std::unique_ptr<MethodResult<>> result) {
+        cam->pending_result_ = std::move(result);
+        return true;
+      });
+
+  EXPECT_CALL(*camera, GetCaptureController)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->capture_controller_.get();
+      });
+
+  EXPECT_CALL(*capture_controller, PausePreview)
+      .Times(1)
+      .WillOnce([cam = camera.get()]() {
+        assert(cam->pending_result_);
+        return cam->pending_result_->Success();
+      });
+
+  camera->camera_id_ = mock_camera_id;
+  camera->capture_controller_ = std::move(capture_controller);
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(1);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(mock_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("pausePreview",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+TEST(CameraPlugin, PausePreviewHandlerErrorOnInvalidCameraId) {
+  int64_t mock_camera_id = 1234;
+  int64_t missing_camera_id = 5678;
+
+  std::unique_ptr<MockMethodResult> initialize_result =
+      std::make_unique<MockMethodResult>();
+
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+
+  std::unique_ptr<MockCaptureController> capture_controller =
+      std::make_unique<MockCaptureController>();
+
+  EXPECT_CALL(*camera, HasCameraId)
+      .Times(1)
+      .WillOnce([cam = camera.get()](int64_t camera_id) {
+        return cam->camera_id_ == camera_id;
+      });
+
+  EXPECT_CALL(*camera, HasPendingResultByType).Times(0);
+  EXPECT_CALL(*camera, AddPendingResult).Times(0);
+  EXPECT_CALL(*camera, GetCaptureController).Times(0);
+  EXPECT_CALL(*capture_controller, PausePreview).Times(0);
+
+  camera->camera_id_ = mock_camera_id;
+
+  MockCameraPlugin plugin(std::make_unique<MockTextureRegistrar>().get(),
+                          std::make_unique<MockBinaryMessenger>().get(),
+                          std::make_unique<MockCameraFactory>());
+
+  // Add the mocked camera to the plugin's camera list.
+  plugin.AddCamera(std::move(camera));
+
+  EXPECT_CALL(*initialize_result, ErrorInternal).Times(1);
+  EXPECT_CALL(*initialize_result, SuccessInternal).Times(0);
+
+  EncodableMap args = {
+      {EncodableValue("cameraId"), EncodableValue(missing_camera_id)},
+  };
+
+  plugin.HandleMethodCall(
+      flutter::MethodCall("pausePreview",
+                          std::make_unique<EncodableValue>(EncodableMap(args))),
+      std::move(initialize_result));
+}
+
+}  // namespace test
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/test/camera_test.cpp b/packages/camera/camera_windows/windows/test/camera_test.cpp
new file mode 100644
index 0000000..899c1fd
--- /dev/null
+++ b/packages/camera/camera_windows/windows/test/camera_test.cpp
@@ -0,0 +1,344 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "camera.h"
+
+#include <flutter/method_call.h>
+#include <flutter/method_result_functions.h>
+#include <flutter/standard_method_codec.h>
+#include <flutter/texture_registrar.h>
+#include <gmock/gmock.h>
+#include <gtest/gtest.h>
+#include <windows.h>
+
+#include <functional>
+#include <memory>
+#include <string>
+
+#include "mocks.h"
+
+namespace camera_windows {
+using ::testing::_;
+using ::testing::Eq;
+using ::testing::NiceMock;
+using ::testing::Pointee;
+
+namespace test {
+
+TEST(Camera, InitCameraCreatesCaptureController) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockCaptureControllerFactory> capture_controller_factory =
+      std::make_unique<MockCaptureControllerFactory>();
+
+  EXPECT_CALL(*capture_controller_factory, CreateCaptureController)
+      .Times(1)
+      .WillOnce(
+          []() { return std::make_unique<NiceMock<MockCaptureController>>(); });
+
+  EXPECT_TRUE(camera->GetCaptureController() == nullptr);
+
+  // Init camera with mock capture controller factory
+  camera->InitCamera(std::move(capture_controller_factory),
+                     std::make_unique<MockTextureRegistrar>().get(),
+                     std::make_unique<MockBinaryMessenger>().get(), false,
+                     ResolutionPreset::kAuto);
+
+  EXPECT_TRUE(camera->GetCaptureController() != nullptr);
+}
+
+TEST(Camera, AddPendingResultReturnsErrorForDuplicates) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> first_pending_result =
+      std::make_unique<MockMethodResult>();
+  std::unique_ptr<MockMethodResult> second_pending_result =
+      std::make_unique<MockMethodResult>();
+
+  EXPECT_CALL(*first_pending_result, ErrorInternal).Times(0);
+  EXPECT_CALL(*first_pending_result, SuccessInternal);
+  EXPECT_CALL(*second_pending_result, ErrorInternal).Times(1);
+
+  camera->AddPendingResult(PendingResultType::kCreateCamera,
+                           std::move(first_pending_result));
+
+  // This should fail
+  camera->AddPendingResult(PendingResultType::kCreateCamera,
+                           std::move(second_pending_result));
+
+  // Mark pending result as succeeded
+  camera->OnCreateCaptureEngineSucceeded(0);
+}
+
+TEST(Camera, OnCreateCaptureEngineSucceededReturnsCameraId) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  const int64_t texture_id = 12345;
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(
+      *result,
+      SuccessInternal(Pointee(EncodableValue(EncodableMap(
+          {{EncodableValue("cameraId"), EncodableValue(texture_id)}})))));
+
+  camera->AddPendingResult(PendingResultType::kCreateCamera, std::move(result));
+
+  camera->OnCreateCaptureEngineSucceeded(texture_id);
+}
+
+TEST(Camera, OnCreateCaptureEngineFailedReturnsError) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string error_text = "error_text";
+
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+  EXPECT_CALL(*result, ErrorInternal(_, Eq(error_text), _));
+
+  camera->AddPendingResult(PendingResultType::kCreateCamera, std::move(result));
+
+  camera->OnCreateCaptureEngineFailed(error_text);
+}
+
+TEST(Camera, OnStartPreviewSucceededReturnsFrameSize) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  const int32_t width = 123;
+  const int32_t height = 456;
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(
+      *result,
+      SuccessInternal(Pointee(EncodableValue(EncodableMap({
+          {EncodableValue("previewWidth"), EncodableValue((float)width)},
+          {EncodableValue("previewHeight"), EncodableValue((float)height)},
+      })))));
+
+  camera->AddPendingResult(PendingResultType::kInitialize, std::move(result));
+
+  camera->OnStartPreviewSucceeded(width, height);
+}
+
+TEST(Camera, OnStartPreviewFailedReturnsError) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string error_text = "error_text";
+
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+  EXPECT_CALL(*result, ErrorInternal(_, Eq(error_text), _));
+
+  camera->AddPendingResult(PendingResultType::kInitialize, std::move(result));
+
+  camera->OnStartPreviewFailed(error_text);
+}
+
+TEST(Camera, OnPausePreviewSucceededReturnsSuccess) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(*result, SuccessInternal(nullptr));
+
+  camera->AddPendingResult(PendingResultType::kPausePreview, std::move(result));
+
+  camera->OnPausePreviewSucceeded();
+}
+
+TEST(Camera, OnPausePreviewFailedReturnsError) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string error_text = "error_text";
+
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+  EXPECT_CALL(*result, ErrorInternal(_, Eq(error_text), _));
+
+  camera->AddPendingResult(PendingResultType::kPausePreview, std::move(result));
+
+  camera->OnPausePreviewFailed(error_text);
+}
+
+TEST(Camera, OnResumePreviewSucceededReturnsSuccess) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(*result, SuccessInternal(nullptr));
+
+  camera->AddPendingResult(PendingResultType::kResumePreview,
+                           std::move(result));
+
+  camera->OnResumePreviewSucceeded();
+}
+
+TEST(Camera, OnResumePreviewFailedReturnsError) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string error_text = "error_text";
+
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+  EXPECT_CALL(*result, ErrorInternal(_, Eq(error_text), _));
+
+  camera->AddPendingResult(PendingResultType::kResumePreview,
+                           std::move(result));
+
+  camera->OnResumePreviewFailed(error_text);
+}
+
+TEST(Camera, OnStartRecordSucceededReturnsSuccess) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(*result, SuccessInternal(nullptr));
+
+  camera->AddPendingResult(PendingResultType::kStartRecord, std::move(result));
+
+  camera->OnStartRecordSucceeded();
+}
+
+TEST(Camera, OnStartRecordFailedReturnsError) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string error_text = "error_text";
+
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+  EXPECT_CALL(*result, ErrorInternal(_, Eq(error_text), _));
+
+  camera->AddPendingResult(PendingResultType::kStartRecord, std::move(result));
+
+  camera->OnStartRecordFailed(error_text);
+}
+
+TEST(Camera, OnStopRecordSucceededReturnsSuccess) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string file_path = "C:\temp\filename.mp4";
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(*result, SuccessInternal(Pointee(EncodableValue(file_path))));
+
+  camera->AddPendingResult(PendingResultType::kStopRecord, std::move(result));
+
+  camera->OnStopRecordSucceeded(file_path);
+}
+
+TEST(Camera, OnStopRecordFailedReturnsError) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string error_text = "error_text";
+
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+  EXPECT_CALL(*result, ErrorInternal(_, Eq(error_text), _));
+
+  camera->AddPendingResult(PendingResultType::kStopRecord, std::move(result));
+
+  camera->OnStopRecordFailed(error_text);
+}
+
+TEST(Camera, OnTakePictureSucceededReturnsSuccess) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string file_path = "C:\temp\filename.jpeg";
+
+  EXPECT_CALL(*result, ErrorInternal).Times(0);
+  EXPECT_CALL(*result, SuccessInternal(Pointee(EncodableValue(file_path))));
+
+  camera->AddPendingResult(PendingResultType::kTakePicture, std::move(result));
+
+  camera->OnTakePictureSucceeded(file_path);
+}
+
+TEST(Camera, OnTakePictureFailedReturnsError) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockMethodResult> result =
+      std::make_unique<MockMethodResult>();
+
+  std::string error_text = "error_text";
+
+  EXPECT_CALL(*result, SuccessInternal).Times(0);
+  EXPECT_CALL(*result, ErrorInternal(_, Eq(error_text), _));
+
+  camera->AddPendingResult(PendingResultType::kTakePicture, std::move(result));
+
+  camera->OnTakePictureFailed(error_text);
+}
+
+TEST(Camera, OnVideoRecordSucceededInvokesCameraChannelEvent) {
+  std::unique_ptr<CameraImpl> camera =
+      std::make_unique<CameraImpl>(MOCK_DEVICE_ID);
+  std::unique_ptr<MockCaptureControllerFactory> capture_controller_factory =
+      std::make_unique<MockCaptureControllerFactory>();
+
+  std::unique_ptr<MockBinaryMessenger> binary_messenger =
+      std::make_unique<MockBinaryMessenger>();
+
+  std::string file_path = "C:\temp\filename.mp4";
+  int64_t camera_id = 12345;
+  std::string camera_channel =
+      std::string("plugins.flutter.io/camera_windows/camera") +
+      std::to_string(camera_id);
+  int64_t video_duration = 1000000;
+
+  EXPECT_CALL(*capture_controller_factory, CreateCaptureController)
+      .Times(1)
+      .WillOnce(
+          []() { return std::make_unique<NiceMock<MockCaptureController>>(); });
+
+  // TODO: test binary content.
+  // The first send is the video record success message and the second is
+  // the camera closing message.
+  EXPECT_CALL(*binary_messenger, Send(Eq(camera_channel), _, _, _)).Times(2);
+
+  // Init camera with a mock capture controller factory.
+  // Keep the texture registrar alive for the camera's lifetime instead of
+  // passing a pointer to a temporary.
+  std::unique_ptr<MockTextureRegistrar> texture_registrar =
+      std::make_unique<MockTextureRegistrar>();
+  camera->InitCamera(std::move(capture_controller_factory),
+                     texture_registrar.get(), binary_messenger.get(), false,
+                     ResolutionPreset::kAuto);
+
+  // Pass the camera id to the camera.
+  camera->OnCreateCaptureEngineSucceeded(camera_id);
+
+  camera->OnVideoRecordSucceeded(file_path, video_duration);
+
+  // Dispose of the camera before the message channel is destroyed.
+  camera = nullptr;
+}
+
+}  // namespace test
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/test/capture_controller_test.cpp b/packages/camera/camera_windows/windows/test/capture_controller_test.cpp
new file mode 100644
index 0000000..7520af7
--- /dev/null
+++ b/packages/camera/camera_windows/windows/test/capture_controller_test.cpp
@@ -0,0 +1,503 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "capture_controller.h"
+
+#include <flutter/method_call.h>
+#include <flutter/method_result_functions.h>
+#include <flutter/standard_method_codec.h>
+#include <flutter/texture_registrar.h>
+#include <gmock/gmock.h>
+#include <gtest/gtest.h>
+#include <windows.h>
+#include <wrl/client.h>
+
+#include <functional>
+#include <memory>
+#include <string>
+
+#include "mocks.h"
+#include "string_utils.h"
+
+namespace camera_windows {
+
+namespace test {
+
+using Microsoft::WRL::ComPtr;
+using ::testing::_;
+using ::testing::Eq;
+using ::testing::Return;
+
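+// Initializes the given capture controller against mocked capture engine,
+// media sources, and texture registrar, verifying that initialization
+// registers a texture and reports success to the camera.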
+void MockInitCaptureController(CaptureControllerImpl* capture_controller,
+                               MockTextureRegistrar* texture_registrar,
+                               MockCaptureEngine* engine, MockCamera* camera,
+                               int64_t mock_texture_id) {
+  ComPtr<MockMediaSource> video_source = new MockMediaSource();
+  ComPtr<MockMediaSource> audio_source = new MockMediaSource();
+
+  capture_controller->SetCaptureEngine(
+      reinterpret_cast<IMFCaptureEngine*>(engine));
+  capture_controller->SetVideoSource(
+      reinterpret_cast<IMFMediaSource*>(video_source.Get()));
+  capture_controller->SetAudioSource(
+      reinterpret_cast<IMFMediaSource*>(audio_source.Get()));
+
+  EXPECT_CALL(*texture_registrar, RegisterTexture)
+      .Times(1)
+      .WillOnce([reg = texture_registrar,
+                 mock_texture_id](flutter::TextureVariant* texture) -> int64_t {
+        EXPECT_TRUE(texture);
+        reg->texture_ = texture;
+        reg->texture_id_ = mock_texture_id;
+        return reg->texture_id_;
+      });
+  EXPECT_CALL(*texture_registrar, UnregisterTexture(Eq(mock_texture_id)))
+      .Times(1);
+  EXPECT_CALL(*camera, OnCreateCaptureEngineFailed).Times(0);
+  EXPECT_CALL(*camera, OnCreateCaptureEngineSucceeded(Eq(mock_texture_id)))
+      .Times(1);
+  EXPECT_CALL(*engine, Initialize).Times(1);
+
+  capture_controller->InitCaptureDevice(texture_registrar, MOCK_DEVICE_ID, true,
+                                        ResolutionPreset::kAuto);
+
+  // MockCaptureEngine::Initialize is called
+  EXPECT_TRUE(engine->initialized_);
+
+  engine->CreateFakeEvent(S_OK, MF_CAPTURE_ENGINE_INITIALIZED);
+}
+
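+// Sets expectations on the preview sink and capture source mocks, starts the
+// preview, and pushes one fake sample through the preview sink so that the
+// first frame callbacks are triggered.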
+void MockStartPreview(CaptureControllerImpl* capture_controller,
+                      MockCaptureSource* capture_source,
+                      MockCapturePreviewSink* preview_sink,
+                      MockTextureRegistrar* texture_registrar,
+                      MockCaptureEngine* engine, MockCamera* camera,
+                      std::unique_ptr<uint8_t[]> mock_source_buffer,
+                      uint32_t mock_source_buffer_size,
+                      uint32_t mock_preview_width, uint32_t mock_preview_height,
+                      int64_t mock_texture_id) {
+  EXPECT_CALL(*engine, GetSink(MF_CAPTURE_ENGINE_SINK_TYPE_PREVIEW, _))
+      .Times(1)
+      .WillOnce([src_sink = preview_sink](MF_CAPTURE_ENGINE_SINK_TYPE sink_type,
+                                          IMFCaptureSink** target_sink) {
+        *target_sink = src_sink;
+        src_sink->AddRef();
+        return S_OK;
+      });
+
+  EXPECT_CALL(*preview_sink, RemoveAllStreams).Times(1).WillOnce(Return(S_OK));
+  EXPECT_CALL(*preview_sink, AddStream).Times(1).WillOnce(Return(S_OK));
+  EXPECT_CALL(*preview_sink, SetSampleCallback)
+      .Times(1)
+      .WillOnce([sink = preview_sink](
+                    DWORD dwStreamSinkIndex,
+                    IMFCaptureEngineOnSampleCallback* pCallback) -> HRESULT {
+        sink->sample_callback_ = pCallback;
+        return S_OK;
+      });
+
+  EXPECT_CALL(*engine, GetSource)
+      .Times(1)
+      .WillOnce(
+          [src_source = capture_source](IMFCaptureSource** target_source) {
+            *target_source = src_source;
+            src_source->AddRef();
+            return S_OK;
+          });
+
+  EXPECT_CALL(
+      *capture_source,
+      GetAvailableDeviceMediaType(
+          Eq((DWORD)
+                 MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_PREVIEW),
+          _, _))
+      .WillRepeatedly([mock_preview_width, mock_preview_height](
+                          DWORD stream_index, DWORD media_type_index,
+                          IMFMediaType** media_type) {
+        // We give only one media type to loop through
+        if (media_type_index != 0) return MF_E_NO_MORE_TYPES;
+        *media_type =
+            new FakeMediaType(MFMediaType_Video, MFVideoFormat_RGB32,
+                              mock_preview_width, mock_preview_height);
+        (*media_type)->AddRef();
+        return S_OK;
+      });
+
+  EXPECT_CALL(
+      *capture_source,
+      GetAvailableDeviceMediaType(
+          Eq((DWORD)MF_CAPTURE_ENGINE_PREFERRED_SOURCE_STREAM_FOR_VIDEO_RECORD),
+          _, _))
+      .WillRepeatedly([mock_preview_width, mock_preview_height](
+                          DWORD stream_index, DWORD media_type_index,
+                          IMFMediaType** media_type) {
+        // We give only one media type to loop through
+        if (media_type_index != 0) return MF_E_NO_MORE_TYPES;
+        *media_type =
+            new FakeMediaType(MFMediaType_Video, MFVideoFormat_RGB32,
+                              mock_preview_width, mock_preview_height);
+        (*media_type)->AddRef();
+        return S_OK;
+      });
+
+  EXPECT_CALL(*engine, StartPreview()).Times(1).WillOnce(Return(S_OK));
+
+  // Called by destructor
+  EXPECT_CALL(*engine, StopPreview()).Times(1).WillOnce(Return(S_OK));
+
+  // Called after first processed sample
+  EXPECT_CALL(*camera,
+              OnStartPreviewSucceeded(mock_preview_width, mock_preview_height))
+      .Times(1);
+  EXPECT_CALL(*camera, OnStartPreviewFailed).Times(0);
+  EXPECT_CALL(*texture_registrar, MarkTextureFrameAvailable(mock_texture_id))
+      .Times(1);
+
+  capture_controller->StartPreview();
+
+  EXPECT_EQ(capture_controller->GetPreviewHeight(), mock_preview_height);
+  EXPECT_EQ(capture_controller->GetPreviewWidth(), mock_preview_width);
+
+  // The capture engine is now started and will first send a preview started
+  // event.
+  engine->CreateFakeEvent(S_OK, MF_CAPTURE_ENGINE_PREVIEW_STARTED);
+
+  // Send a fake sample through the preview sink.
+  preview_sink->SendFakeSample(mock_source_buffer.get(),
+                               mock_source_buffer_size);
+}
+
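+// Sets expectations on the record sink mock, starts recording to the given
+// path, and verifies that the camera is notified once the engine reports
+// that recording has started.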
+void MockRecordStart(CaptureControllerImpl* capture_controller,
+                     MockCaptureEngine* engine,
+                     MockCaptureRecordSink* record_sink, MockCamera* camera,
+                     const std::string& mock_path_to_video) {
+  EXPECT_CALL(*engine, StartRecord()).Times(1).WillOnce(Return(S_OK));
+
+  EXPECT_CALL(*engine, GetSink(MF_CAPTURE_ENGINE_SINK_TYPE_RECORD, _))
+      .Times(1)
+      .WillOnce([src_sink = record_sink](MF_CAPTURE_ENGINE_SINK_TYPE sink_type,
+                                         IMFCaptureSink** target_sink) {
+        *target_sink = src_sink;
+        src_sink->AddRef();
+        return S_OK;
+      });
+
+  EXPECT_CALL(*record_sink, RemoveAllStreams).Times(1).WillOnce(Return(S_OK));
+  EXPECT_CALL(*record_sink, AddStream).Times(2).WillRepeatedly(Return(S_OK));
+  EXPECT_CALL(*record_sink, SetOutputFileName).Times(1).WillOnce(Return(S_OK));
+
+  capture_controller->StartRecord(mock_path_to_video, -1);
+
+  EXPECT_CALL(*camera, OnStartRecordSucceeded()).Times(1);
+  engine->CreateFakeEvent(S_OK, MF_CAPTURE_ENGINE_RECORD_STARTED);
+}
+
+TEST(CaptureController,
+     InitCaptureEngineCallsOnCreateCaptureEngineSucceededWithTextureId) {
+  ComPtr<MockCaptureEngine> engine = new MockCaptureEngine();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+  std::unique_ptr<CaptureControllerImpl> capture_controller =
+      std::make_unique<CaptureControllerImpl>(camera.get());
+  std::unique_ptr<MockTextureRegistrar> texture_registrar =
+      std::make_unique<MockTextureRegistrar>();
+
+  uint64_t mock_texture_id = 1234;
+
+  // Initialize the capture controller with mocks and verify expectations.
+  MockInitCaptureController(capture_controller.get(), texture_registrar.get(),
+                            engine.Get(), camera.get(), mock_texture_id);
+
+  capture_controller = nullptr;
+  camera = nullptr;
+  texture_registrar = nullptr;
+  engine = nullptr;
+}
+
+TEST(CaptureController, StartPreviewStartsProcessingSamples) {
+  ComPtr<MockCaptureEngine> engine = new MockCaptureEngine();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+  std::unique_ptr<CaptureControllerImpl> capture_controller =
+      std::make_unique<CaptureControllerImpl>(camera.get());
+  std::unique_ptr<MockTextureRegistrar> texture_registrar =
+      std::make_unique<MockTextureRegistrar>();
+
+  uint64_t mock_texture_id = 1234;
+
+  // Initialize capture controller to be able to start preview
+  MockInitCaptureController(capture_controller.get(), texture_registrar.get(),
+                            engine.Get(), camera.get(), mock_texture_id);
+
+  ComPtr<MockCapturePreviewSink> preview_sink = new MockCapturePreviewSink();
+  ComPtr<MockCaptureSource> capture_source = new MockCaptureSource();
+
+  // Let's keep these small for mock texture data. Two pixels should be
+  // enough.
+  uint32_t mock_preview_width = 2;
+  uint32_t mock_preview_height = 1;
+  uint32_t pixels_total = mock_preview_width * mock_preview_height;
+  uint32_t pixel_size = 4;
+
+  // Build mock texture
+  uint32_t mock_texture_data_size = pixels_total * pixel_size;
+
+  std::unique_ptr<uint8_t[]> mock_source_buffer =
+      std::make_unique<uint8_t[]>(mock_texture_data_size);
+
+  uint8_t mock_red_pixel = 0x11;
+  uint8_t mock_green_pixel = 0x22;
+  uint8_t mock_blue_pixel = 0x33;
+  MFVideoFormatRGB32Pixel* mock_source_buffer_data =
+      (MFVideoFormatRGB32Pixel*)mock_source_buffer.get();
+
+  for (uint32_t i = 0; i < pixels_total; i++) {
+    mock_source_buffer_data[i].r = mock_red_pixel;
+    mock_source_buffer_data[i].g = mock_green_pixel;
+    mock_source_buffer_data[i].b = mock_blue_pixel;
+  }
+
+  // Start preview and run preview tests
+  MockStartPreview(capture_controller.get(), capture_source.Get(),
+                   preview_sink.Get(), texture_registrar.get(), engine.Get(),
+                   camera.get(), std::move(mock_source_buffer),
+                   mock_texture_data_size, mock_preview_width,
+                   mock_preview_height, mock_texture_id);
+
+  // Test texture processing
+  EXPECT_TRUE(texture_registrar->texture_);
+  if (texture_registrar->texture_) {
+    auto pixel_buffer_texture =
+        std::get_if<flutter::PixelBufferTexture>(texture_registrar->texture_);
+    EXPECT_TRUE(pixel_buffer_texture);
+
+    if (pixel_buffer_texture) {
+      auto converted_buffer =
+          pixel_buffer_texture->CopyPixelBuffer((size_t)100, (size_t)100);
+
+      EXPECT_TRUE(converted_buffer);
+      if (converted_buffer) {
+        EXPECT_EQ(converted_buffer->height, mock_preview_height);
+        EXPECT_EQ(converted_buffer->width, mock_preview_width);
+
+        FlutterDesktopPixel* converted_buffer_data =
+            (FlutterDesktopPixel*)(converted_buffer->buffer);
+
+        for (uint32_t i = 0; i < pixels_total; i++) {
+          EXPECT_EQ(converted_buffer_data[i].r, mock_red_pixel);
+          EXPECT_EQ(converted_buffer_data[i].g, mock_green_pixel);
+          EXPECT_EQ(converted_buffer_data[i].b, mock_blue_pixel);
+        }
+
+        // Call release callback to get mutex lock unlocked.
+        converted_buffer->release_callback(converted_buffer->release_context);
+      }
+      converted_buffer = nullptr;
+    }
+    pixel_buffer_texture = nullptr;
+  }
+
+  capture_controller = nullptr;
+  engine = nullptr;
+  camera = nullptr;
+  texture_registrar = nullptr;
+}
+
+TEST(CaptureController, StartRecordSuccess) {
+  ComPtr<MockCaptureEngine> engine = new MockCaptureEngine();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+  std::unique_ptr<CaptureControllerImpl> capture_controller =
+      std::make_unique<CaptureControllerImpl>(camera.get());
+  std::unique_ptr<MockTextureRegistrar> texture_registrar =
+      std::make_unique<MockTextureRegistrar>();
+
+  uint64_t mock_texture_id = 1234;
+
+  // Initialize capture controller to be able to start preview
+  MockInitCaptureController(capture_controller.get(), texture_registrar.get(),
+                            engine.Get(), camera.get(), mock_texture_id);
+
+  ComPtr<MockCapturePreviewSink> preview_sink = new MockCapturePreviewSink();
+  ComPtr<MockCaptureSource> capture_source = new MockCaptureSource();
+
+  std::unique_ptr<uint8_t[]> mock_source_buffer =
+      std::make_unique<uint8_t[]>(0);
+
+  // Start preview to be able to start record
+  MockStartPreview(capture_controller.get(), capture_source.Get(),
+                   preview_sink.Get(), texture_registrar.get(), engine.Get(),
+                   camera.get(), std::move(mock_source_buffer), 0, 1, 1,
+                   mock_texture_id);
+
+  // Start record
+  ComPtr<MockCaptureRecordSink> record_sink = new MockCaptureRecordSink();
+  std::string mock_path_to_video = "mock_path_to_video";
+  MockRecordStart(capture_controller.get(), engine.Get(), record_sink.Get(),
+                  camera.get(), mock_path_to_video);
+
+  // Called by destructor
+  EXPECT_CALL(*(engine.Get()), StopRecord(true, false))
+      .Times(1)
+      .WillOnce(Return(S_OK));
+
+  capture_controller = nullptr;
+  texture_registrar = nullptr;
+  engine = nullptr;
+  camera = nullptr;
+  record_sink = nullptr;
+}
+
+TEST(CaptureController, StopRecordSuccess) {
+  ComPtr<MockCaptureEngine> engine = new MockCaptureEngine();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+  std::unique_ptr<CaptureControllerImpl> capture_controller =
+      std::make_unique<CaptureControllerImpl>(camera.get());
+  std::unique_ptr<MockTextureRegistrar> texture_registrar =
+      std::make_unique<MockTextureRegistrar>();
+
+  uint64_t mock_texture_id = 1234;
+
+  // Initialize capture controller to be able to start preview
+  MockInitCaptureController(capture_controller.get(), texture_registrar.get(),
+                            engine.Get(), camera.get(), mock_texture_id);
+
+  ComPtr<MockCapturePreviewSink> preview_sink = new MockCapturePreviewSink();
+  ComPtr<MockCaptureSource> capture_source = new MockCaptureSource();
+
+  std::unique_ptr<uint8_t[]> mock_source_buffer =
+      std::make_unique<uint8_t[]>(0);
+
+  // Start preview to be able to start record
+  MockStartPreview(capture_controller.get(), capture_source.Get(),
+                   preview_sink.Get(), texture_registrar.get(), engine.Get(),
+                   camera.get(), std::move(mock_source_buffer), 0, 1, 1,
+                   mock_texture_id);
+
+  // Start record
+  ComPtr<MockCaptureRecordSink> record_sink = new MockCaptureRecordSink();
+  std::string mock_path_to_video = "mock_path_to_video";
+  MockRecordStart(capture_controller.get(), engine.Get(), record_sink.Get(),
+                  camera.get(), mock_path_to_video);
+
+  // Request to stop record
+  EXPECT_CALL(*(engine.Get()), StopRecord(true, false))
+      .Times(1)
+      .WillOnce(Return(S_OK));
+  capture_controller->StopRecord();
+
+  // OnStopRecordSucceeded should be called with mocked file path
+  EXPECT_CALL(*camera, OnStopRecordSucceeded(Eq(mock_path_to_video))).Times(1);
+  engine->CreateFakeEvent(S_OK, MF_CAPTURE_ENGINE_RECORD_STOPPED);
+
+  capture_controller = nullptr;
+  texture_registrar = nullptr;
+  engine = nullptr;
+  camera = nullptr;
+  record_sink = nullptr;
+}
+
+TEST(CaptureController, TakePictureSuccess) {
+  ComPtr<MockCaptureEngine> engine = new MockCaptureEngine();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+  std::unique_ptr<CaptureControllerImpl> capture_controller =
+      std::make_unique<CaptureControllerImpl>(camera.get());
+  std::unique_ptr<MockTextureRegistrar> texture_registrar =
+      std::make_unique<MockTextureRegistrar>();
+
+  uint64_t mock_texture_id = 1234;
+
+  // Initialize capture controller to be able to start preview
+  MockInitCaptureController(capture_controller.get(), texture_registrar.get(),
+                            engine.Get(), camera.get(), mock_texture_id);
+
+  ComPtr<MockCapturePreviewSink> preview_sink = new MockCapturePreviewSink();
+  ComPtr<MockCaptureSource> capture_source = new MockCaptureSource();
+
+  std::unique_ptr<uint8_t[]> mock_source_buffer =
+      std::make_unique<uint8_t[]>(0);
+
+  // Start preview to be able to take a picture
+  MockStartPreview(capture_controller.get(), capture_source.Get(),
+                   preview_sink.Get(), texture_registrar.get(), engine.Get(),
+                   camera.get(), std::move(mock_source_buffer), 0, 1, 1,
+                   mock_texture_id);
+
+  // Init photo sink tests
+  ComPtr<MockCapturePhotoSink> photo_sink = new MockCapturePhotoSink();
+  EXPECT_CALL(*(engine.Get()), GetSink(MF_CAPTURE_ENGINE_SINK_TYPE_PHOTO, _))
+      .Times(1)
+      .WillOnce(
+          [src_sink = photo_sink.Get()](MF_CAPTURE_ENGINE_SINK_TYPE sink_type,
+                                        IMFCaptureSink** target_sink) {
+            *target_sink = src_sink;
+            src_sink->AddRef();
+            return S_OK;
+          });
+  EXPECT_CALL(*(photo_sink.Get()), RemoveAllStreams)
+      .Times(1)
+      .WillOnce(Return(S_OK));
+  EXPECT_CALL(*(photo_sink.Get()), AddStream).Times(1).WillOnce(Return(S_OK));
+  EXPECT_CALL(*(photo_sink.Get()), SetOutputFileName)
+      .Times(1)
+      .WillOnce(Return(S_OK));
+
+  // Request photo
+  std::string mock_path_to_photo = "mock_path_to_photo";
+  EXPECT_CALL(*(engine.Get()), TakePhoto()).Times(1).WillOnce(Return(S_OK));
+  capture_controller->TakePicture(mock_path_to_photo);
+
+  // OnTakePictureSucceeded should be called with mocked file path
+  EXPECT_CALL(*camera, OnTakePictureSucceeded(Eq(mock_path_to_photo))).Times(1);
+  engine->CreateFakeEvent(S_OK, MF_CAPTURE_ENGINE_PHOTO_TAKEN);
+
+  capture_controller = nullptr;
+  texture_registrar = nullptr;
+  engine = nullptr;
+  camera = nullptr;
+  photo_sink = nullptr;
+}
+
+TEST(CaptureController, PauseResumePreviewSuccess) {
+  ComPtr<MockCaptureEngine> engine = new MockCaptureEngine();
+  std::unique_ptr<MockCamera> camera =
+      std::make_unique<MockCamera>(MOCK_DEVICE_ID);
+  std::unique_ptr<CaptureControllerImpl> capture_controller =
+      std::make_unique<CaptureControllerImpl>(camera.get());
+  std::unique_ptr<MockTextureRegistrar> texture_registrar =
+      std::make_unique<MockTextureRegistrar>();
+
+  uint64_t mock_texture_id = 1234;
+
+  // Initialize capture controller to be able to start preview
+  MockInitCaptureController(capture_controller.get(), texture_registrar.get(),
+                            engine.Get(), camera.get(), mock_texture_id);
+
+  ComPtr<MockCapturePreviewSink> preview_sink = new MockCapturePreviewSink();
+  ComPtr<MockCaptureSource> capture_source = new MockCaptureSource();
+
+  std::unique_ptr<uint8_t[]> mock_source_buffer =
+      std::make_unique<uint8_t[]>(0);
+
+  // Start preview to be able to pause and resume it
+  MockStartPreview(capture_controller.get(), capture_source.Get(),
+                   preview_sink.Get(), texture_registrar.get(), engine.Get(),
+                   camera.get(), std::move(mock_source_buffer), 0, 1, 1,
+                   mock_texture_id);
+
+  EXPECT_CALL(*camera, OnPausePreviewSucceeded()).Times(1);
+  capture_controller->PausePreview();
+
+  EXPECT_CALL(*camera, OnResumePreviewSucceeded()).Times(1);
+  capture_controller->ResumePreview();
+
+  capture_controller = nullptr;
+  texture_registrar = nullptr;
+  engine = nullptr;
+  camera = nullptr;
+}
+
+}  // namespace test
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/test/mocks.h b/packages/camera/camera_windows/windows/test/mocks.h
new file mode 100644
index 0000000..0781989
--- /dev/null
+++ b/packages/camera/camera_windows/windows/test/mocks.h
@@ -0,0 +1,1015 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_TEST_MOCKS_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_TEST_MOCKS_H_
+
+#include <flutter/method_call.h>
+#include <flutter/method_result_functions.h>
+#include <flutter/standard_method_codec.h>
+#include <flutter/texture_registrar.h>
+#include <gmock/gmock.h>
+#include <gtest/gtest.h>
+#include <mfcaptureengine.h>
+
+#include "camera.h"
+#include "camera_plugin.h"
+#include "capture_controller.h"
+#include "capture_controller_listener.h"
+#include "capture_engine_listener.h"
+
+namespace camera_windows {
+namespace test {
+
+namespace {
+
+using flutter::EncodableMap;
+using flutter::EncodableValue;
+using ::testing::_;
+
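+// Mock method result used to verify the success or error responses that are
+// returned to the method channel caller.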
+class MockMethodResult : public flutter::MethodResult<> {
+ public:
+  ~MockMethodResult() = default;
+
+  MOCK_METHOD(void, SuccessInternal, (const EncodableValue* result),
+              (override));
+  MOCK_METHOD(void, ErrorInternal,
+              (const std::string& error_code, const std::string& error_message,
+               const EncodableValue* details),
+              (override));
+  MOCK_METHOD(void, NotImplementedInternal, (), (override));
+};
+
+class MockBinaryMessenger : public flutter::BinaryMessenger {
+ public:
+  ~MockBinaryMessenger() = default;
+
+  MOCK_METHOD(void, Send,
+              (const std::string& channel, const uint8_t* message,
+               size_t message_size, flutter::BinaryReply reply),
+              (const));
+
+  MOCK_METHOD(void, SetMessageHandler,
+              (const std::string& channel,
+               flutter::BinaryMessageHandler handler),
+              ());
+};
+
+class MockTextureRegistrar : public flutter::TextureRegistrar {
+ public:
+  MockTextureRegistrar() {
+    ON_CALL(*this, RegisterTexture)
+        .WillByDefault([this](flutter::TextureVariant* texture) -> int64_t {
+          EXPECT_TRUE(texture);
+          this->texture_ = texture;
+          this->texture_id_ = 1000;
+          return this->texture_id_;
+        });
+
+    ON_CALL(*this, UnregisterTexture)
+        .WillByDefault([this](int64_t tid) -> bool {
+          if (tid == this->texture_id_) {
+            texture_ = nullptr;
+            this->texture_id_ = -1;
+            return true;
+          }
+          return false;
+        });
+
+    ON_CALL(*this, MarkTextureFrameAvailable)
+        .WillByDefault([this](int64_t tid) -> bool {
+          if (tid == this->texture_id_) {
+            return true;
+          }
+          return false;
+        });
+  }
+
+  ~MockTextureRegistrar() { texture_ = nullptr; }
+
+  MOCK_METHOD(int64_t, RegisterTexture, (flutter::TextureVariant * texture),
+              (override));
+
+  MOCK_METHOD(bool, UnregisterTexture, (int64_t), (override));
+  MOCK_METHOD(bool, MarkTextureFrameAvailable, (int64_t), (override));
+
+  int64_t texture_id_ = -1;
+  flutter::TextureVariant* texture_ = nullptr;
+};
+
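+// Camera factory mock that by default hands out the preconfigured
+// |pending_camera_| instance instead of creating a new camera.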
+class MockCameraFactory : public CameraFactory {
+ public:
+  MockCameraFactory() {
+    ON_CALL(*this, CreateCamera).WillByDefault([this]() {
+      assert(this->pending_camera_);
+      return std::move(this->pending_camera_);
+    });
+  }
+
+  ~MockCameraFactory() = default;
+
+  // Disallow copy and move.
+  MockCameraFactory(const MockCameraFactory&) = delete;
+  MockCameraFactory& operator=(const MockCameraFactory&) = delete;
+
+  MOCK_METHOD(std::unique_ptr<Camera>, CreateCamera,
+              (const std::string& device_id), (override));
+
+  std::unique_ptr<Camera> pending_camera_;
+};
+
+class MockCamera : public Camera {
+ public:
+  MockCamera(const std::string& device_id)
+      : Camera(device_id), device_id_(device_id) {}
+
+  ~MockCamera() = default;
+
+  // Disallow copy and move.
+  MockCamera(const MockCamera&) = delete;
+  MockCamera& operator=(const MockCamera&) = delete;
+
+  MOCK_METHOD(void, OnCreateCaptureEngineSucceeded, (int64_t texture_id),
+              (override));
+  MOCK_METHOD(std::unique_ptr<flutter::MethodResult<>>, GetPendingResultByType,
+              (PendingResultType type));
+  MOCK_METHOD(void, OnCreateCaptureEngineFailed, (const std::string& error),
+              (override));
+
+  MOCK_METHOD(void, OnStartPreviewSucceeded, (int32_t width, int32_t height),
+              (override));
+  MOCK_METHOD(void, OnStartPreviewFailed, (const std::string& error),
+              (override));
+
+  MOCK_METHOD(void, OnResumePreviewSucceeded, (), (override));
+  MOCK_METHOD(void, OnResumePreviewFailed, (const std::string& error),
+              (override));
+
+  MOCK_METHOD(void, OnPausePreviewSucceeded, (), (override));
+  MOCK_METHOD(void, OnPausePreviewFailed, (const std::string& error),
+              (override));
+
+  MOCK_METHOD(void, OnStartRecordSucceeded, (), (override));
+  MOCK_METHOD(void, OnStartRecordFailed, (const std::string& error),
+              (override));
+
+  MOCK_METHOD(void, OnStopRecordSucceeded, (const std::string& file_path),
+              (override));
+  MOCK_METHOD(void, OnStopRecordFailed, (const std::string& error), (override));
+
+  MOCK_METHOD(void, OnTakePictureSucceeded, (const std::string& file_path),
+              (override));
+  MOCK_METHOD(void, OnTakePictureFailed, (const std::string& error),
+              (override));
+
+  MOCK_METHOD(void, OnVideoRecordSucceeded,
+              (const std::string& file_path, int64_t video_duration),
+              (override));
+  MOCK_METHOD(void, OnVideoRecordFailed, (const std::string& error),
+              (override));
+  MOCK_METHOD(void, OnCaptureError, (const std::string& error), (override));
+
+  MOCK_METHOD(bool, HasDeviceId, (std::string & device_id), (const override));
+  MOCK_METHOD(bool, HasCameraId, (int64_t camera_id), (const override));
+
+  MOCK_METHOD(bool, AddPendingResult,
+              (PendingResultType type, std::unique_ptr<MethodResult<>> result),
+              (override));
+  MOCK_METHOD(bool, HasPendingResultByType, (PendingResultType type),
+              (const override));
+
+  MOCK_METHOD(camera_windows::CaptureController*, GetCaptureController, (),
+              (override));
+
+  MOCK_METHOD(void, InitCamera,
+              (flutter::TextureRegistrar * texture_registrar,
+               flutter::BinaryMessenger* messenger, bool record_audio,
+               ResolutionPreset resolution_preset),
+              (override));
+
+  std::unique_ptr<CaptureController> capture_controller_;
+  std::unique_ptr<MethodResult<>> pending_result_;
+  std::string device_id_;
+  int64_t camera_id_ = -1;
+};
+
+class MockCaptureControllerFactory : public CaptureControllerFactory {
+ public:
+  MockCaptureControllerFactory(){};
+  virtual ~MockCaptureControllerFactory() = default;
+
+  // Disallow copy and move.
+  MockCaptureControllerFactory(const MockCaptureControllerFactory&) = delete;
+  MockCaptureControllerFactory& operator=(const MockCaptureControllerFactory&) =
+      delete;
+
+  MOCK_METHOD(std::unique_ptr<CaptureController>, CreateCaptureController,
+              (CaptureControllerListener * listener), (override));
+};
+
+class MockCaptureController : public CaptureController {
+ public:
+  ~MockCaptureController() = default;
+
+  MOCK_METHOD(void, InitCaptureDevice,
+              (flutter::TextureRegistrar * texture_registrar,
+               const std::string& device_id, bool record_audio,
+               ResolutionPreset resolution_preset),
+              (override));
+
+  MOCK_METHOD(uint32_t, GetPreviewWidth, (), (const override));
+  MOCK_METHOD(uint32_t, GetPreviewHeight, (), (const override));
+
+  // Actions
+  MOCK_METHOD(void, StartPreview, (), (override));
+  MOCK_METHOD(void, ResumePreview, (), (override));
+  MOCK_METHOD(void, PausePreview, (), (override));
+  MOCK_METHOD(void, StartRecord,
+              (const std::string& file_path, int64_t max_video_duration_ms),
+              (override));
+  MOCK_METHOD(void, StopRecord, (), (override));
+  MOCK_METHOD(void, TakePicture, (const std::string& file_path), (override));
+};
+
+// MockCameraPlugin extends CameraPlugin to allow cameras to be added without
+// going through the create message handler, and to allow mocking of static
+// system calls.
+class MockCameraPlugin : public CameraPlugin {
+ public:
+  MockCameraPlugin(flutter::TextureRegistrar* texture_registrar,
+                   flutter::BinaryMessenger* messenger)
+      : CameraPlugin(texture_registrar, messenger){};
+
+  // Creates a plugin instance with the given CameraFactory instance.
+  // Exists for unit testing with mock implementations.
+  MockCameraPlugin(flutter::TextureRegistrar* texture_registrar,
+                   flutter::BinaryMessenger* messenger,
+                   std::unique_ptr<CameraFactory> camera_factory)
+      : CameraPlugin(texture_registrar, messenger, std::move(camera_factory)){};
+
+  ~MockCameraPlugin() = default;
+
+  // Disallow copy and move.
+  MockCameraPlugin(const MockCameraPlugin&) = delete;
+  MockCameraPlugin& operator=(const MockCameraPlugin&) = delete;
+
+  MOCK_METHOD(bool, EnumerateVideoCaptureDeviceSources,
+              (IMFActivate * **devices, UINT32* count), (override));
+
+  // Helper to add a camera for testing without creating it via
+  // CameraFactory.
+  void AddCamera(std::unique_ptr<Camera> camera) {
+    cameras_.push_back(std::move(camera));
+  }
+};
+
+class MockCaptureSource : public IMFCaptureSource {
+ public:
+  MockCaptureSource(){};
+  ~MockCaptureSource() = default;
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&ref_); }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) Release() {
+    LONG ref = InterlockedDecrement(&ref_);
+    if (ref == 0) {
+      delete this;
+    }
+    return ref;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv) {
+    *ppv = nullptr;
+
+    if (riid == IID_IMFCaptureSource) {
+      *ppv = static_cast<IMFCaptureSource*>(this);
+      ((IUnknown*)*ppv)->AddRef();
+      return S_OK;
+    }
+
+    return E_NOINTERFACE;
+  }
+
+  MOCK_METHOD(HRESULT, GetCaptureDeviceSource,
+              (MF_CAPTURE_ENGINE_DEVICE_TYPE mfCaptureEngineDeviceType,
+               IMFMediaSource** ppMediaSource));
+  MOCK_METHOD(HRESULT, GetCaptureDeviceActivate,
+              (MF_CAPTURE_ENGINE_DEVICE_TYPE mfCaptureEngineDeviceType,
+               IMFActivate** ppActivate));
+  MOCK_METHOD(HRESULT, GetService,
+              (REFIID rguidService, REFIID riid, IUnknown** ppUnknown));
+  MOCK_METHOD(HRESULT, AddEffect,
+              (DWORD dwSourceStreamIndex, IUnknown* pUnknown));
+
+  MOCK_METHOD(HRESULT, RemoveEffect,
+              (DWORD dwSourceStreamIndex, IUnknown* pUnknown));
+  MOCK_METHOD(HRESULT, RemoveAllEffects, (DWORD dwSourceStreamIndex));
+  MOCK_METHOD(HRESULT, GetAvailableDeviceMediaType,
+              (DWORD dwSourceStreamIndex, DWORD dwMediaTypeIndex,
+               IMFMediaType** ppMediaType));
+  MOCK_METHOD(HRESULT, SetCurrentDeviceMediaType,
+              (DWORD dwSourceStreamIndex, IMFMediaType* pMediaType));
+  MOCK_METHOD(HRESULT, GetCurrentDeviceMediaType,
+              (DWORD dwSourceStreamIndex, IMFMediaType** ppMediaType));
+  MOCK_METHOD(HRESULT, GetDeviceStreamCount, (DWORD * pdwStreamCount));
+  MOCK_METHOD(HRESULT, GetDeviceStreamCategory,
+              (DWORD dwSourceStreamIndex,
+               MF_CAPTURE_ENGINE_STREAM_CATEGORY* pStreamCategory));
+  MOCK_METHOD(HRESULT, GetMirrorState,
+              (DWORD dwStreamIndex, BOOL* pfMirrorState));
+  MOCK_METHOD(HRESULT, SetMirrorState,
+              (DWORD dwStreamIndex, BOOL fMirrorState));
+  MOCK_METHOD(HRESULT, GetStreamIndexFromFriendlyName,
+              (UINT32 uifriendlyName, DWORD* pdwActualStreamIndex));
+
+ private:
+  volatile ULONG ref_ = 0;
+};
+
+// Uses IMFMediaSourceEx which has SetD3DManager method.
+class MockMediaSource : public IMFMediaSourceEx {
+ public:
+  MockMediaSource(){};
+  ~MockMediaSource() = default;
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&ref_); }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) Release() {
+    LONG ref = InterlockedDecrement(&ref_);
+    if (ref == 0) {
+      delete this;
+    }
+    return ref;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv) {
+    *ppv = nullptr;
+
+    if (riid == IID_IMFMediaSource) {
+      *ppv = static_cast<IMFMediaSource*>(this);
+      ((IUnknown*)*ppv)->AddRef();
+      return S_OK;
+    }
+
+    return E_NOINTERFACE;
+  }
+
+  // IMFMediaSource
+  HRESULT GetCharacteristics(DWORD* dwCharacteristics) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaSource
+  HRESULT CreatePresentationDescriptor(
+      IMFPresentationDescriptor** presentationDescriptor) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaSource
+  HRESULT Start(IMFPresentationDescriptor* presentationDescriptor,
+                const GUID* guidTimeFormat,
+                const PROPVARIANT* varStartPosition) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaSource
+  HRESULT Stop(void) override { return E_NOTIMPL; }
+  // IMFMediaSource
+  HRESULT Pause(void) override { return E_NOTIMPL; }
+  // IMFMediaSource
+  HRESULT Shutdown(void) override { return E_NOTIMPL; }
+
+  // IMFMediaEventGenerator
+  HRESULT GetEvent(DWORD dwFlags, IMFMediaEvent** event) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaEventGenerator
+  HRESULT BeginGetEvent(IMFAsyncCallback* callback,
+                        IUnknown* unkState) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaEventGenerator
+  HRESULT EndGetEvent(IMFAsyncResult* result, IMFMediaEvent** event) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaEventGenerator
+  HRESULT QueueEvent(MediaEventType met, REFGUID guidExtendedType,
+                     HRESULT hrStatus, const PROPVARIANT* value) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFMediaSourceEx
+  HRESULT GetSourceAttributes(IMFAttributes** attributes) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaSourceEx
+  HRESULT GetStreamAttributes(DWORD stream_id,
+                              IMFAttributes** attributes) override {
+    return E_NOTIMPL;
+  }
+  // IMFMediaSourceEx
+  HRESULT SetD3DManager(IUnknown* manager) override { return S_OK; }
+
+ private:
+  volatile ULONG ref_ = 0;
+};
+
+class MockCapturePreviewSink : public IMFCapturePreviewSink {
+ public:
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, GetOutputMediaType,
+              (DWORD dwSinkStreamIndex, IMFMediaType** ppMediaType));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, GetService,
+              (DWORD dwSinkStreamIndex, REFGUID rguidService, REFIID riid,
+               IUnknown** ppUnknown));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, AddStream,
+              (DWORD dwSourceStreamIndex, IMFMediaType* pMediaType,
+               IMFAttributes* pAttributes, DWORD* pdwSinkStreamIndex));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, Prepare, ());
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, RemoveAllStreams, ());
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, SetRenderHandle, (HANDLE handle));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, SetRenderSurface, (IUnknown * pSurface));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, UpdateVideo,
+              (const MFVideoNormalizedRect* pSrc, const RECT* pDst,
+               const COLORREF* pBorderClr));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, SetSampleCallback,
+              (DWORD dwStreamSinkIndex,
+               IMFCaptureEngineOnSampleCallback* pCallback));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, GetMirrorState, (BOOL * pfMirrorState));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, SetMirrorState, (BOOL fMirrorState));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, GetRotation,
+              (DWORD dwStreamIndex, DWORD* pdwRotationValue));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, SetRotation,
+              (DWORD dwStreamIndex, DWORD dwRotationValue));
+
+  // IMFCapturePreviewSink
+  MOCK_METHOD(HRESULT, SetCustomSink, (IMFMediaSink * pMediaSink));
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&ref_); }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) Release() {
+    LONG ref = InterlockedDecrement(&ref_);
+    if (ref == 0) {
+      delete this;
+    }
+    return ref;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv) {
+    *ppv = nullptr;
+
+    if (riid == IID_IMFCapturePreviewSink) {
+      *ppv = static_cast<IMFCapturePreviewSink*>(this);
+      ((IUnknown*)*ppv)->AddRef();
+      return S_OK;
+    }
+
+    return E_NOINTERFACE;
+  }
+
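+  // Wraps the given buffer in an IMFSample and forwards it to the registered
+  // sample callback, simulating a preview frame coming from the capture
+  // engine.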
+  void SendFakeSample(uint8_t* src_buffer, uint32_t size) {
+    assert(sample_callback_);
+    ComPtr<IMFSample> sample;
+    ComPtr<IMFMediaBuffer> buffer;
+    HRESULT hr = MFCreateSample(&sample);
+
+    if (SUCCEEDED(hr)) {
+      hr = MFCreateMemoryBuffer(size, &buffer);
+    }
+
+    if (SUCCEEDED(hr)) {
+      uint8_t* target_data;
+      if (SUCCEEDED(buffer->Lock(&target_data, nullptr, nullptr))) {
+        std::copy(src_buffer, src_buffer + size, target_data);
+      }
+      hr = buffer->Unlock();
+    }
+
+    if (SUCCEEDED(hr)) {
+      hr = buffer->SetCurrentLength(size);
+    }
+
+    if (SUCCEEDED(hr)) {
+      hr = sample->AddBuffer(buffer.Get());
+    }
+
+    if (SUCCEEDED(hr)) {
+      sample_callback_->OnSample(sample.Get());
+    }
+  }
+
+  ComPtr<IMFCaptureEngineOnSampleCallback> sample_callback_;
+
+ private:
+  ~MockCapturePreviewSink() = default;
+  volatile ULONG ref_ = 0;
+};
+
+class MockCaptureRecordSink : public IMFCaptureRecordSink {
+ public:
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, GetOutputMediaType,
+              (DWORD dwSinkStreamIndex, IMFMediaType** ppMediaType));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, GetService,
+              (DWORD dwSinkStreamIndex, REFGUID rguidService, REFIID riid,
+               IUnknown** ppUnknown));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, AddStream,
+              (DWORD dwSourceStreamIndex, IMFMediaType* pMediaType,
+               IMFAttributes* pAttributes, DWORD* pdwSinkStreamIndex));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, Prepare, ());
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, RemoveAllStreams, ());
+
+  // IMFCaptureRecordSink
+  MOCK_METHOD(HRESULT, SetOutputByteStream,
+              (IMFByteStream * pByteStream, REFGUID guidContainerType));
+
+  // IMFCaptureRecordSink
+  MOCK_METHOD(HRESULT, SetOutputFileName, (LPCWSTR fileName));
+
+  // IMFCaptureRecordSink
+  MOCK_METHOD(HRESULT, SetSampleCallback,
+              (DWORD dwStreamSinkIndex,
+               IMFCaptureEngineOnSampleCallback* pCallback));
+
+  // IMFCaptureRecordSink
+  MOCK_METHOD(HRESULT, SetCustomSink, (IMFMediaSink * pMediaSink));
+
+  // IMFCaptureRecordSink
+  MOCK_METHOD(HRESULT, GetRotation,
+              (DWORD dwStreamIndex, DWORD* pdwRotationValue));
+
+  // IMFCaptureRecordSink
+  MOCK_METHOD(HRESULT, SetRotation,
+              (DWORD dwStreamIndex, DWORD dwRotationValue));
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&ref_); }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) Release() {
+    LONG ref = InterlockedDecrement(&ref_);
+    if (ref == 0) {
+      delete this;
+    }
+    return ref;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv) {
+    *ppv = nullptr;
+
+    if (riid == IID_IMFCaptureRecordSink) {
+      *ppv = static_cast<IMFCaptureRecordSink*>(this);
+      ((IUnknown*)*ppv)->AddRef();
+      return S_OK;
+    }
+
+    return E_NOINTERFACE;
+  }
+
+ private:
+  ~MockCaptureRecordSink() = default;
+  volatile ULONG ref_ = 0;
+};
+
+class MockCapturePhotoSink : public IMFCapturePhotoSink {
+ public:
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, GetOutputMediaType,
+              (DWORD dwSinkStreamIndex, IMFMediaType** ppMediaType));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, GetService,
+              (DWORD dwSinkStreamIndex, REFGUID rguidService, REFIID riid,
+               IUnknown** ppUnknown));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, AddStream,
+              (DWORD dwSourceStreamIndex, IMFMediaType* pMediaType,
+               IMFAttributes* pAttributes, DWORD* pdwSinkStreamIndex));
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, Prepare, ());
+
+  // IMFCaptureSink
+  MOCK_METHOD(HRESULT, RemoveAllStreams, ());
+
+  // IMFCapturePhotoSink
+  MOCK_METHOD(HRESULT, SetOutputFileName, (LPCWSTR fileName));
+
+  // IMFCapturePhotoSink
+  MOCK_METHOD(HRESULT, SetSampleCallback,
+              (IMFCaptureEngineOnSampleCallback * pCallback));
+
+  // IMFCapturePhotoSink
+  MOCK_METHOD(HRESULT, SetOutputByteStream, (IMFByteStream * pByteStream));
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&ref_); }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) Release() {
+    LONG ref = InterlockedDecrement(&ref_);
+    if (ref == 0) {
+      delete this;
+    }
+    return ref;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv) {
+    *ppv = nullptr;
+
+    if (riid == IID_IMFCapturePhotoSink) {
+      *ppv = static_cast<IMFCapturePhotoSink*>(this);
+      ((IUnknown*)*ppv)->AddRef();
+      return S_OK;
+    }
+
+    return E_NOINTERFACE;
+  }
+
+ private:
+  ~MockCapturePhotoSink() = default;
+  volatile ULONG ref_ = 0;
+};
+
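+// Partial IMFAttributes implementation that stubs every attribute method
+// with E_NOTIMPL so that fakes deriving from it only need to override the
+// methods a test actually exercises.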
+template <class T>
+class FakeIMFAttributesBase : public T {
+  static_assert(std::is_base_of<IMFAttributes, T>::value,
+                "I must inherit from IMFAttributes");
+
+  // IMFAttributes
+  HRESULT GetItem(REFGUID guidKey, PROPVARIANT* pValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetItemType(REFGUID guidKey, MF_ATTRIBUTE_TYPE* pType) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT CompareItem(REFGUID guidKey, REFPROPVARIANT Value,
+                      BOOL* pbResult) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT Compare(IMFAttributes* pTheirs, MF_ATTRIBUTES_MATCH_TYPE MatchType,
+                  BOOL* pbResult) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetUINT32(REFGUID guidKey, UINT32* punValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetUINT64(REFGUID guidKey, UINT64* punValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetDouble(REFGUID guidKey, double* pfValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetGUID(REFGUID guidKey, GUID* pguidValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetStringLength(REFGUID guidKey, UINT32* pcchLength) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetString(REFGUID guidKey, LPWSTR pwszValue, UINT32 cchBufSize,
+                    UINT32* pcchLength) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetAllocatedString(REFGUID guidKey, LPWSTR* ppwszValue,
+                             UINT32* pcchLength) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetBlobSize(REFGUID guidKey, UINT32* pcbBlobSize) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetBlob(REFGUID guidKey, UINT8* pBuf, UINT32 cbBufSize,
+                  UINT32* pcbBlobSize) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetAllocatedBlob(REFGUID guidKey, UINT8** ppBuf,
+                           UINT32* pcbSize) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT GetUnknown(REFGUID guidKey, REFIID riid,
+                     __RPC__deref_out_opt LPVOID* ppv) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT SetItem(REFGUID guidKey, REFPROPVARIANT Value) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT DeleteItem(REFGUID guidKey) override { return E_NOTIMPL; }
+
+  // IMFAttributes
+  HRESULT DeleteAllItems(void) override { return E_NOTIMPL; }
+
+  // IMFAttributes
+  HRESULT SetUINT32(REFGUID guidKey, UINT32 unValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT SetUINT64(REFGUID guidKey, UINT64 unValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT SetDouble(REFGUID guidKey, double fValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT SetGUID(REFGUID guidKey, REFGUID guidValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT SetString(REFGUID guidKey, LPCWSTR wszValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT SetBlob(REFGUID guidKey, const UINT8* pBuf,
+                  UINT32 cbBufSize) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT SetUnknown(REFGUID guidKey, IUnknown* pUnknown) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT LockStore(void) override { return E_NOTIMPL; }
+
+  // IMFAttributes
+  HRESULT UnlockStore(void) override { return E_NOTIMPL; }
+
+  // IMFAttributes
+  HRESULT GetCount(UINT32* pcItems) override { return E_NOTIMPL; }
+
+  // IMFAttributes
+  HRESULT GetItemByIndex(UINT32 unIndex, GUID* pguidKey,
+                         PROPVARIANT* pValue) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFAttributes
+  HRESULT CopyAllItems(IMFAttributes* pDest) override { return E_NOTIMPL; }
+};
+
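+// Fake media type that reports the given major type, subtype, and frame size
+// together with a fixed 30 fps frame rate.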
+class FakeMediaType : public FakeIMFAttributesBase<IMFMediaType> {
+ public:
+  FakeMediaType(GUID major_type, GUID sub_type, int width, int height)
+      : major_type_(major_type),
+        sub_type_(sub_type),
+        width_(width),
+        height_(height){};
+
+  // IMFAttributes
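+  // MF_MT_FRAME_SIZE and MF_MT_FRAME_RATE pack two 32-bit values into one
+  // UINT64 with the first value in the upper 32 bits, matching
+  // MFSetAttributeSize and MFSetAttributeRatio.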
+  HRESULT GetUINT64(REFGUID key, UINT64* value) override {
+    if (key == MF_MT_FRAME_SIZE) {
+      *value = (int64_t)width_ << 32 | (int64_t)height_;
+      return S_OK;
+    } else if (key == MF_MT_FRAME_RATE) {
+      *value = (int64_t)frame_rate_ << 32 | 1;
+      return S_OK;
+    }
+    return E_FAIL;
+  };
+
+  // IMFAttributes
+  HRESULT GetGUID(REFGUID key, GUID* value) override {
+    if (key == MF_MT_MAJOR_TYPE) {
+      *value = major_type_;
+      return S_OK;
+    } else if (key == MF_MT_SUBTYPE) {
+      *value = sub_type_;
+      return S_OK;
+    }
+    return E_FAIL;
+  }
+
+  // IMFAttributes
+  HRESULT CopyAllItems(IMFAttributes* pDest) override {
+    pDest->SetUINT64(MF_MT_FRAME_SIZE,
+                     (int64_t)width_ << 32 | (int64_t)height_);
+    pDest->SetUINT64(MF_MT_FRAME_RATE, (int64_t)frame_rate_ << 32 | 1);
+    pDest->SetGUID(MF_MT_MAJOR_TYPE, major_type_);
+    pDest->SetGUID(MF_MT_SUBTYPE, sub_type_);
+    return S_OK;
+  }
+
+  // IMFMediaType
+  HRESULT STDMETHODCALLTYPE GetMajorType(GUID* pguidMajorType) override {
+    return E_NOTIMPL;
+  };
+
+  // IMFMediaType
+  HRESULT STDMETHODCALLTYPE IsCompressedFormat(BOOL* pfCompressed) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFMediaType
+  HRESULT STDMETHODCALLTYPE IsEqual(IMFMediaType* pIMediaType,
+                                    DWORD* pdwFlags) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFMediaType
+  HRESULT STDMETHODCALLTYPE GetRepresentation(
+      GUID guidRepresentation, LPVOID* ppvRepresentation) override {
+    return E_NOTIMPL;
+  }
+
+  // IMFMediaType
+  HRESULT STDMETHODCALLTYPE FreeRepresentation(
+      GUID guidRepresentation, LPVOID pvRepresentation) override {
+    return E_NOTIMPL;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&ref_); }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) Release() {
+    LONG ref = InterlockedDecrement(&ref_);
+    if (ref == 0) {
+      delete this;
+    }
+    return ref;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv) {
+    *ppv = nullptr;
+
+    if (riid == IID_IMFMediaType) {
+      *ppv = static_cast<IMFMediaType*>(this);
+      ((IUnknown*)*ppv)->AddRef();
+      return S_OK;
+    }
+
+    return E_NOINTERFACE;
+  }
+
+ private:
+  ~FakeMediaType() = default;
+  volatile ULONG ref_ = 0;
+  const GUID major_type_;
+  const GUID sub_type_;
+  const int width_;
+  const int height_;
+  const int frame_rate_ = 30;
+};
+
+class MockCaptureEngine : public IMFCaptureEngine {
+ public:
+  MockCaptureEngine() {
+    ON_CALL(*this, Initialize)
+        .WillByDefault([this](IMFCaptureEngineOnEventCallback* callback,
+                              IMFAttributes* attributes, IUnknown* audioSource,
+                              IUnknown* videoSource) -> HRESULT {
+          EXPECT_TRUE(callback);
+          EXPECT_TRUE(attributes);
+          EXPECT_TRUE(videoSource);
+          // audioSource is allowed to be nullptr.
+          callback_ = callback;
+          videoSource_ = reinterpret_cast<IMFMediaSource*>(videoSource);
+          audioSource_ = reinterpret_cast<IMFMediaSource*>(audioSource);
+          initialized_ = true;
+          return S_OK;
+        });
+  };
+
+  virtual ~MockCaptureEngine() = default;
+
+  MOCK_METHOD(HRESULT, Initialize,
+              (IMFCaptureEngineOnEventCallback * callback,
+               IMFAttributes* attributes, IUnknown* audioSource,
+               IUnknown* videoSource));
+  MOCK_METHOD(HRESULT, StartPreview, ());
+  MOCK_METHOD(HRESULT, StopPreview, ());
+  MOCK_METHOD(HRESULT, StartRecord, ());
+  MOCK_METHOD(HRESULT, StopRecord,
+              (BOOL finalize, BOOL flushUnprocessedSamples));
+  MOCK_METHOD(HRESULT, TakePhoto, ());
+  MOCK_METHOD(HRESULT, GetSink,
+              (MF_CAPTURE_ENGINE_SINK_TYPE type, IMFCaptureSink** sink));
+  MOCK_METHOD(HRESULT, GetSource, (IMFCaptureSource * *ppSource));
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) AddRef() { return InterlockedIncrement(&ref_); }
+
+  // IUnknown
+  STDMETHODIMP_(ULONG) Release() {
+    LONG ref = InterlockedDecrement(&ref_);
+    if (ref == 0) {
+      delete this;
+    }
+    return ref;
+  }
+
+  // IUnknown
+  STDMETHODIMP_(HRESULT) QueryInterface(const IID& riid, void** ppv) {
+    *ppv = nullptr;
+
+    if (riid == IID_IMFCaptureEngine) {
+      *ppv = static_cast<IMFCaptureEngine*>(this);
+      ((IUnknown*)*ppv)->AddRef();
+      return S_OK;
+    }
+
+    return E_NOINTERFACE;
+  }
+
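+  // Simulates a capture engine event by forwarding a synthesized
+  // IMFMediaEvent to the registered event callback.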
+  void CreateFakeEvent(HRESULT hrStatus, GUID event_type) {
+    EXPECT_TRUE(initialized_);
+    ComPtr<IMFMediaEvent> event;
+    MFCreateMediaEvent(MEExtendedType, event_type, hrStatus, nullptr, &event);
+    if (callback_) {
+      callback_->OnEvent(event.Get());
+    }
+  }
+
+  ComPtr<IMFCaptureEngineOnEventCallback> callback_;
+  ComPtr<IMFMediaSource> videoSource_;
+  ComPtr<IMFMediaSource> audioSource_;
+  volatile ULONG ref_ = 0;
+  bool initialized_ = false;
+};
+
+#define MOCK_DEVICE_ID "mock_device_id"
+#define MOCK_CAMERA_NAME "mock_camera_name <" MOCK_DEVICE_ID ">"
+#define MOCK_INVALID_CAMERA_NAME "invalid_camera_name"
+
+}  // namespace
+}  // namespace test
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_TEST_MOCKS_H_
diff --git a/packages/camera/camera_windows/windows/texture_handler.cpp b/packages/camera/camera_windows/windows/texture_handler.cpp
new file mode 100644
index 0000000..a7c9473
--- /dev/null
+++ b/packages/camera/camera_windows/windows/texture_handler.cpp
@@ -0,0 +1,144 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#include "texture_handler.h"
+
+#include <cassert>
+
+namespace camera_windows {
+
+TextureHandler::~TextureHandler() {
+  // The texture might still be processed while the destructor is called.
+  // Lock the mutex for safe destruction.
+  const std::lock_guard<std::mutex> lock(buffer_mutex_);
+  if (texture_registrar_ && texture_id_ > 0) {
+    texture_registrar_->UnregisterTexture(texture_id_);
+  }
+  texture_id_ = -1;
+  texture_ = nullptr;
+  texture_registrar_ = nullptr;
+}
+
+int64_t TextureHandler::RegisterTexture() {
+  if (!texture_registrar_) {
+    return -1;
+  }
+
+  // Create a Flutter desktop pixel buffer texture.
+  texture_ =
+      std::make_unique<flutter::TextureVariant>(flutter::PixelBufferTexture(
+          [this](size_t width,
+                 size_t height) -> const FlutterDesktopPixelBuffer* {
+            return this->ConvertPixelBufferForFlutter(width, height);
+          }));
+
+  texture_id_ = texture_registrar_->RegisterTexture(texture_.get());
+  return texture_id_;
+}
+
+bool TextureHandler::UpdateBuffer(uint8_t* data, uint32_t data_length) {
+  // Scoped lock guard.
+  {
+    const std::lock_guard<std::mutex> lock(buffer_mutex_);
+    if (!TextureRegistered()) {
+      return false;
+    }
+
+    if (source_buffer_.size() != data_length) {
+      // Update source buffer size.
+      source_buffer_.resize(data_length);
+    }
+    std::copy(data, data + data_length, source_buffer_.data());
+  }
+  OnBufferUpdated();
+  return true;
+};
+
+// Marks texture frame available after buffer is updated.
+void TextureHandler::OnBufferUpdated() {
+  if (TextureRegistered()) {
+    texture_registrar_->MarkTextureFrameAvailable(texture_id_);
+  }
+}
+
+const FlutterDesktopPixelBuffer* TextureHandler::ConvertPixelBufferForFlutter(
+    size_t target_width, size_t target_height) {
+  // TODO: optimize image processing size by adjusting capture size
+  // dynamically to match target_width and target_height.
+  // If target size changes, create new media type for preview and set new
+  // target framesize to MF_MT_FRAME_SIZE attribute.
+  // Size should be kept inside requested resolution preset.
+  // Update output media type with IMFCaptureSink2::SetOutputMediaType method
+  // call and implement IMFCaptureEngineOnSampleCallback2::OnSynchronizedEvent
+  // to detect size changes.
+
+  // Lock buffer mutex to protect texture processing
+  std::unique_lock<std::mutex> buffer_lock(buffer_mutex_);
+  if (!TextureRegistered()) {
+    return nullptr;
+  }
+
+  const uint32_t bytes_per_pixel = 4;
+  const uint32_t pixels_total = preview_frame_width_ * preview_frame_height_;
+  const uint32_t data_size = pixels_total * bytes_per_pixel;
+  if (data_size > 0 && source_buffer_.size() == data_size) {
+    if (dest_buffer_.size() != data_size) {
+      dest_buffer_.resize(data_size);
+    }
+
+    // Map buffers to structs for easier conversion.
+    MFVideoFormatRGB32Pixel* src =
+        reinterpret_cast<MFVideoFormatRGB32Pixel*>(source_buffer_.data());
+    FlutterDesktopPixel* dst =
+        reinterpret_cast<FlutterDesktopPixel*>(dest_buffer_.data());
+
+    for (uint32_t y = 0; y < preview_frame_height_; y++) {
+      for (uint32_t x = 0; x < preview_frame_width_; x++) {
+        uint32_t sp = (y * preview_frame_width_) + x;
+        if (mirror_preview_) {
+          // Software mirror mode.
+          // IMFCapturePreviewSink also has a SetMirrorState setting, but if
+          // it is enabled, samples will not be processed.
+
+          // Calculates mirrored pixel position.
+          uint32_t tp =
+              (y * preview_frame_width_) + ((preview_frame_width_ - 1) - x);
+          dst[tp].r = src[sp].r;
+          dst[tp].g = src[sp].g;
+          dst[tp].b = src[sp].b;
+          dst[tp].a = 255;
+        } else {
+          dst[sp].r = src[sp].r;
+          dst[sp].g = src[sp].g;
+          dst[sp].b = src[sp].b;
+          dst[sp].a = 255;
+        }
+      }
+    }
+
+    if (!flutter_desktop_pixel_buffer_) {
+      flutter_desktop_pixel_buffer_ =
+          std::make_unique<FlutterDesktopPixelBuffer>();
+
+      // Unlocks mutex after texture is processed.
+      flutter_desktop_pixel_buffer_->release_callback =
+          [](void* release_context) {
+            auto mutex = reinterpret_cast<std::mutex*>(release_context);
+            mutex->unlock();
+          };
+    }
+
+    flutter_desktop_pixel_buffer_->buffer = dest_buffer_.data();
+    flutter_desktop_pixel_buffer_->width = preview_frame_width_;
+    flutter_desktop_pixel_buffer_->height = preview_frame_height_;
+
+    // Releases the unique_lock and sets the mutex pointer as the release
+    // context.
+    flutter_desktop_pixel_buffer_->release_context = buffer_lock.release();
+
+    return flutter_desktop_pixel_buffer_.get();
+  }
+  return nullptr;
+}
+
+}  // namespace camera_windows
diff --git a/packages/camera/camera_windows/windows/texture_handler.h b/packages/camera/camera_windows/windows/texture_handler.h
new file mode 100644
index 0000000..b85611c
--- /dev/null
+++ b/packages/camera/camera_windows/windows/texture_handler.h
@@ -0,0 +1,91 @@
+// Copyright 2013 The Flutter Authors. All rights reserved.
+// Use of this source code is governed by a BSD-style license that can be
+// found in the LICENSE file.
+
+#ifndef PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_TEXTURE_HANDLER_H_
+#define PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_TEXTURE_HANDLER_H_
+
+#include <flutter/texture_registrar.h>
+
+#include <memory>
+#include <mutex>
+#include <string>
+#include <vector>
+
+namespace camera_windows {
+
+// Describes the pixel data order of a Flutter desktop pixel buffer.
+struct FlutterDesktopPixel {
+  uint8_t r = 0;
+  uint8_t g = 0;
+  uint8_t b = 0;
+  uint8_t a = 0;
+};
+
+// Describes MFVideoFormat_RGB32 data order.
+struct MFVideoFormatRGB32Pixel {
+  uint8_t b = 0;
+  uint8_t g = 0;
+  uint8_t r = 0;
+  uint8_t x = 0;
+};
+
+// Handles the registration of Flutter textures, updates to the pixel buffer,
+// and the conversion of pixel formats.
+class TextureHandler {
+ public:
+  explicit TextureHandler(flutter::TextureRegistrar* texture_registrar)
+      : texture_registrar_(texture_registrar) {}
+  virtual ~TextureHandler();
+
+  // Prevent copying.
+  TextureHandler(TextureHandler const&) = delete;
+  TextureHandler& operator=(TextureHandler const&) = delete;
+
+  // Updates source data buffer with given data.
+  bool UpdateBuffer(uint8_t* data, uint32_t data_length);
+
+  // Registers a texture and returns its id, or -1 if registration fails.
+  int64_t RegisterTexture();
+
+  // Updates current preview texture size.
+  void UpdateTextureSize(uint32_t width, uint32_t height) {
+    preview_frame_width_ = width;
+    preview_frame_height_ = height;
+  }
+
+  // Sets software mirror state.
+  void SetMirrorPreviewState(bool mirror) { mirror_preview_ = mirror; }
+
+ private:
+  // Informs the Flutter texture registrar of an updated texture.
+  void OnBufferUpdated();
+
+  // Converts the local pixel buffer to a Flutter pixel buffer.
+  const FlutterDesktopPixelBuffer* ConvertPixelBufferForFlutter(size_t width,
+                                                                size_t height);
+
+  // Checks if texture registrar, texture id and texture are available.
+  bool TextureRegistered() {
+    return texture_registrar_ && texture_ && texture_id_ > -1;
+  }
+
+  bool mirror_preview_ = true;
+  int64_t texture_id_ = -1;
+  uint32_t preview_frame_width_ = 0;
+  uint32_t preview_frame_height_ = 0;
+
+  std::vector<uint8_t> source_buffer_;
+  std::vector<uint8_t> dest_buffer_;
+  std::unique_ptr<flutter::TextureVariant> texture_;
+  std::unique_ptr<FlutterDesktopPixelBuffer> flutter_desktop_pixel_buffer_ =
+      nullptr;
+  flutter::TextureRegistrar* texture_registrar_ = nullptr;
+
+  std::mutex buffer_mutex_;
+};
+
+}  // namespace camera_windows
+
+#endif  // PACKAGES_CAMERA_CAMERA_WINDOWS_WINDOWS_TEXTURE_HANDLER_H_