<?xml version="1.0" encoding="utf-8" ?> 
		    <rss version="2.0"> 
			<channel> 
			    <title>Tutorials | Personal View news aggregator</title> 
				<link>http://www.personal-view.com/news/tutorials</link> 
				<description></description><item>
			<title>Budget Filmmaking Tools You Can Get at Your Local Hardware Store</title>
			<link>http://www.personal-view.com/news/articles/budget-filmmaking-tools-you-can-get-at-your-local-hardware-store</link>
<description><![CDATA[Who doesn’t love a good DIY filmmaking tool? One of my fondest memories from film school was going to Home Depot with my dad to get the parts and tools I needed to build a DIY camera gimbal. We had found a tutorial online that laid out how to make a simple weight-balanced gimbal rig out of PVC pipes, and it was one of the most rewarding afternoon projects of my life.

It also produced a very useful, practical gimbal, which I used for years with my DSLR setup. Projects like these are not only fun but can also provide a creative spark for your filmmaking routine.

In the same spirit, we really dug this video on budget filmmaking gear by YouTube filmmaker Keaton Nye, which goes over the gear he uses from a hardware store. Let’s check it out.

Hardware Store Budget Gear

If you’re looking for cheap yet practical DIY gear for your filmmaking projects, this is a great video to check out, as it goes over lots of details on how Nye has built his go-to filmmaking toolset.
Here’s a breakdown of the tools featured in the video:
0:00 Intro
0:45 4-in-1 Screwdriver
1:21 Furni Pads
2:12 Electronics Repair Kit
3:44 Ammo Crates
4:20 Poly Cases
5:00 Honorable Mentions
In the video, Nye focuses on the tools he’s gotten from Harbor Freight, which he uses regularly on his filmmaking projects. These are all items you can purchase and start using right away, whether at Harbor Freight in particular, at other hardware stores, or online.

More DIY Filmmaking Resources

We’ve covered tons of other helpful DIY filmmaking tricks, tips, and tools in the past that you can consider using for your own projects. Check out some more DIY filmmaking articles from the NFS archives below.]]></description>
			<pubDate>Fri, 30 Jan 2026 21:34:09 UTC</pubDate>
			</item><item>
			<title>Hollyland Solidcom M1 Pro Released – A Scalable 1.9 GHz Intercom for Medium-Scale Productions</title>
			<link>http://www.personal-view.com/news/articles/hollyland-solidcom-m1-pro-released-a-scalable-19-ghz-intercom-for-mediumscale-productions</link>
			<description><![CDATA[Hollyland has introduced the Solidcom M1 Pro, a full-duplex wireless intercom system designed for medium-scale productions such as concerts and exhibition venues. Available in four- and eight-user configurations, the system operates over the 1.9 GHz spectrum and supports cascade expansion up to 24 users, positioning it between small crew headset systems and larger wired broadcast infrastructures. Unlike all-in-one headset systems, the M1 Pro uses beltpacks paired with professional LEMO headsets, aligning it more closely with traditional broadcast-style intercom workflows. Let’s look at what the system offers!



Hollyland has been steadily expanding its lineup, recently introducing products like the Lyra 4K webcam, which we discussed in our November podcast, the VCore smartphone monitoring system, and the Pyro 5 all-in-one wireless monitor. With the Solidcom M1 Pro, Hollyland shifts back to crew communication, targeting structured production teams rather than individual content creators. For details on how it works, have a look at their tutorial:









1.9 GHz transmission and audio structure



The Solidcom M1 Pro operates over the 1.9 GHz spectrum, a band that’s generally less crowded than 2.4 GHz when you’re working in RF-heavy venues. In practical terms, that means fewer surprises when you’re working in venues already packed with wireless microphones, in-ear systems, and Wi-Fi traffic.



The system supports full-duplex communication for up to eight users. Hollyland also integrates automatic frequency hopping to maintain stable transmission if interference is detected.



On the audio side, the M1 Pro uses dual-microphone environmental noise cancellation and electret-condenser microphones. It features a 16 kHz sampling rate and a 200 Hz to 7 kHz frequency response, optimized for speech-focused communication. In practice, that aligns with typical speech-optimized intercom design rather than full-range audio transmission.
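Those two figures are internally consistent: a sampled audio path can only reproduce frequencies below half its sampling rate (the Nyquist limit), so a 7 kHz ceiling fits under a 16 kHz rate with headroom to spare. A quick sketch of that arithmetic (our illustration, not anything from Hollyland’s spec sheet):

```python
# Nyquist sanity check: a sampled audio path can only reproduce
# frequencies below half its sampling rate.
def nyquist_hz(sample_rate_hz: float) -> float:
    """Highest frequency representable at a given sampling rate."""
    return sample_rate_hz / 2

sample_rate = 16_000  # M1 Pro's stated sampling rate, in Hz
top_of_band = 7_000   # stated upper end of the frequency response, in Hz

# 7 kHz sits below the 8 kHz Nyquist limit, consistent with a
# speech-optimized design.
assert top_of_band < nyquist_hz(sample_rate)
print(f"Nyquist limit: {nyquist_hz(sample_rate):.0f} Hz")
```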






Range and antenna configuration



The M1 Pro base station includes a built-in panel antenna designed for directional coverage. According to Hollyland, the panel antenna supports transmission up to 1,300 feet (approximately 396 meters) in front of the base station and 160 feet (about 49 meters) behind it in unobstructed environments.



For broader coverage, the system also ships with two external antennas that provide 360-degree transmission in a radius of up to 985 feet (approximately 300 meters). This configuration is intended for larger rooms or venues where crew members are moving continuously rather than working within a fixed frontal field.



Image credit: Hollyland



Expansion and integration options



While the M1 Pro launches in four- and eight-user configurations, the system can be expanded further through cascade connection. Two base stations can be linked using a standard network cable up to 328 feet (100 meters) or via XLR, allowing support for up to 24 headsets simultaneously.



The base station includes multiple interface options, including 2-wire XLR and 4-wire RJ45 connections, along with PoE/LAN power input. This allows the M1 Pro to integrate into existing wired intercom infrastructures rather than operate solely as a standalone system.



Image credit: Hollyland



Grouping can be managed directly from the base station, the Solidcom app, or a private web interface, allowing beltpacks to be assigned to separate teams such as lighting, camera, or production. Up to three groups can be configured within a single system, which should be enough for most mid-sized event workflows.



Main station. Image credit: Hollyland



Beltpack design and power management



The Solidcom M1 Pro uses beltpacks paired with wired headsets rather than an all-in-one design. Each beltpack weighs approximately 7 oz (about 200 g) and features an HD LCD for status and grouping information, along with three ergonomically differentiated function buttons to help prevent accidental presses in low-light conditions.



The beltpacks support both LEMO and standard 3.5 mm headset connections, allowing compatibility with third-party options. A sidetone function is included for quick microphone checks when working solo.



Power comes from two removable lithium-ion batteries per beltpack, each rated for up to six hours of runtime and charging in approximately 2.5 hours. A charging station is included with the four-pack kit, and the base station supports dual NP-F batteries as well as PoE power input (which means longer events can be covered with battery swaps rather than downtime for charging).



M1 Pro base station. Image credit: Hollyland



Pricing and availability



The Solidcom M1 Pro is available in both four- and eight-user configurations, with the choice of single-ear or dual-ear headsets.



Pricing is set at $4,799 for the four-user single-ear package and $4,999 for the four-user dual-ear version. The eight-user configuration is priced at $7,699 for single-ear headsets and $7,999 for the dual-ear package.



Each kit includes beltpacks, wired headsets, a charging station, and the base station. Cascade capability allows expansion beyond eight users when additional base stations are added.



For more information, please see the Hollyland website. 



For teams working in medium-sized venues, does the M1 Pro strike the right balance between flexibility and structure? Would you opt for a scalable 1.9 GHz wireless system like this, or stick with a traditional wired intercom setup for similar productions? Let us know in the comments!]]></description>
			<pubDate>Thu, 29 Jan 2026 09:02:39 UTC</pubDate>
			</item><item>
			<title>Accsoon CineView M7 Firmware Update Adds Sony EI Control, Nikon Support, ARRI-Style False Color, and Vertical Workflow</title>
			<link>http://www.personal-view.com/news/articles/accsoon-cineview-m7-firmware-update-adds-sony-ei-control-nikon-support-arristyle-false-color-and-vertical-workflow</link>
			<description><![CDATA[Accsoon has released another major firmware update for the CineView M7 and M7 Pro monitors, expanding camera control to include Nikon models for the first time while adding Sony EI adjustment, a new focus peaking tool, ARRI-style false color monitoring, vertical UI support, and significant DCI 4K workflow improvements.



The update continues Accsoon’s aggressive firmware development cycle for the M7 series, which has seen consistent feature additions since the monitors launched in July 2025. Following previous updates that introduced camera control for Sony and Canon mirrorless cameras and later added TX transmission mode for the M7 Pro, this release focuses on professional monitoring tools and broader camera compatibility.



Nikon camera control arrives



Perhaps the most significant addition is native camera control support for Nikon Z-mount cameras, marking the first time the CineView M7 series has extended beyond Sony and Canon ecosystems. The update brings imaging parameter control across multiple Nikon Z models including the Z5, Z50, Z6, Z6 II, Z7 II, Z8, and Z9.



Focus control via the monitor’s touchscreen interface is available on the flagship Z8 and Z9 models. Other supported Nikon cameras allow parameter adjustment but will disable video output during focus changes due to camera-side limitations. This is an important workflow consideration for shooters using lower-tier Nikon bodies who may need to plan around this constraint.



Accsoon CineView M7 Pro on set. Image credit: Accsoon



Expanded Sony support with EI control



Sony users gain two notable new capabilities with this update. The monitors now support EI (Exposure Index) and shutter angle adjustment on compatible Sony cameras, with automatic detection that syncs the M7’s parameter dial with whatever mode the camera is currently using. This intelligent sync should help operators avoid confusion when switching between different exposure controls.



The supported Sony camera list now includes the FX6 (currently in beta), FX3, FX30, A7 IV, A7s III, A9 III, and A7c II. Canon support remains unchanged with the R5, R5 II, R6, R6 II, R7, and R8 all compatible.



New peaking focus assist tool



Accsoon has introduced a second focus assistance option called Peaking, which takes a different approach than the existing color-overlay system (now renamed Focus). Rather than overlaying colored highlights on in-focus edges, the new Peaking tool strengthens the contrast of sharp edges directly on the original image. This makes it easier to identify the focal plane without the visual distraction of colored overlays.



According to Accsoon, the tool behaves similarly to focus assist functions found on higher-end professional monitors. The default Peaking strength is set at 30%, which the company says corresponds roughly to a value of 10 to 15 on other professional monitoring systems.



ARRI-style false color for cinema workflows



The Accsoon CineView M7 firmware update adds ARRI-style false color alongside the existing legacy false color mode. Users can now switch between the two depending on their preference or workflow requirements. The new monitoring-based ARRI-style implementation supports LogC3 and LogC4 recording formats, including compatibility with the ARRI Alexa 35 and Alexa 265.



EI selection is also included for improved exposure accuracy when working with ARRI log footage, giving operators familiar with ARRI’s exposure tools a more consistent monitoring experience across their camera and display systems.



Vertical UI and livestreaming support



Accompanying the release of Accsoon’s Triple Monitor Kit for the CineView M7 Pro, a vertical UI has been added to the monitoring page of the built-in Accsoon SEE App. This optimized layout is designed specifically for vertical content production, which has become increasingly important for social media and short-form video workflows.



Alongside Accsoon’s Triple Monitor Kit for the CineView M7 Pro, a vertical UI is now available. Image credit: Accsoon



Vertical livestreaming is now supported as well, with the encoder automatically matching the current viewing orientation. Rotation controls lock once streaming begins to prevent accidental orientation changes mid-broadcast.



DCI 4K workflow optimization



The update brings significant improvements for productions working with DCI 4K signals. Accsoon has integrated a new ratio conversion pipeline that intelligently handles both 17:9 DCI formats and 16:9 UHD/FHD formats. This use-case-dependent approach should reduce the manual configuration previously required when switching between aspect ratios.
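To make the 17:9-versus-16:9 handling concrete, here’s a hypothetical sketch of the kind of decision such a ratio-conversion pipeline has to make when fitting a source onto a 16:9 panel. This is our illustration, not Accsoon’s actual implementation:

```python
# Hypothetical sketch: classify an incoming signal against a 16:9
# panel and choose letterbox vs. pillarbox. Not Accsoon's real logic.
from fractions import Fraction

def fit_mode(src_w: int, src_h: int, panel: Fraction = Fraction(16, 9)) -> str:
    """Return how a src_w x src_h signal fits a panel of the given ratio."""
    src = Fraction(src_w, src_h)
    if src == panel:
        return "native"       # same ratio, no bars needed
    # Wider than the panel -> bars top/bottom; narrower -> bars left/right.
    return "letterbox" if src > panel else "pillarbox"

print(fit_mode(4096, 2160))   # DCI 4K (~17:9) on a 16:9 panel -> letterbox
print(fit_mode(3840, 2160))   # UHD -> native
```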



Additional image processing improvements include an upgraded pipeline delivering smoother color transitions, cleaner edge definition, and more natural sensor noise rendering. A new interlaced signal processing pipeline has also been added, improving image quality from interlaced sources by reducing aliasing and smoothing edges.



Accsoon CineView M7 firmware update availability



The firmware update for the Accsoon CineView M7 and M7 Pro is available now as a free download from the Accsoon Download Center. Tutorial videos explaining the new features are available on the company’s official channels. The CineView M7 is priced at $799 while the M7 Pro with built-in wireless transmission retails at $899.



Accsoon continues to expand the CineView M7 series’ capabilities at an impressive pace. Have you integrated the M7 or M7 Pro into your monitoring setup, and which of these new features interests you most? Let us know in the comments below.]]></description>
			<pubDate>Thu, 22 Jan 2026 09:31:03 UTC</pubDate>
			</item><item>
			<title>Learn the Basic Functions of the Fujifilm GFX ETERNA 55 With Latest Hands-On Tutorial</title>
			<link>http://www.personal-view.com/news/articles/learn-the-basic-functions-of-the-fujifilm-gfx-eterna-55-with-latest-handson-tutorial</link>
<description><![CDATA[Whether you’re raring to get going with Fujifilm’s large-format camera beast as soon as possible, or you’re simply an aspiring fan of this exciting new cinema camera, we’re not going to lie—this new hands-on tutorial series that Fujifilm is putting out is pretty awesome.

While there are, of course, lots of great tutorials put out by pros and influencers for every new camera these days, having a brand so thoroughly go over all of the basics you need to get started with a camera is thoughtful, helpful, and appreciated.

So, if you’re at all curious about the Fujifilm GFX ETERNA 55 and what it has to offer for your high-end cinematography needs, check out this latest tutorial covering all of the basic functions you need to know to get started.

Fujifilm GFX ETERNA 55 Basic Functions

As its title explains, this hands-on video series dives into its second chapter as Michael from Fujifilm North America guides us through the basic functions of the camera and its menu structure. It’s a pretty straightforward video that goes over how first-time users can navigate the menu and the shortcut buttons.
Highlights of the video include learning how to adjust frame rate, ISO, shutter angle, and ND filters. The tutorial also does a good job of covering essential settings like format, date and time, and how to set a custom white balance.
These are all helpful tips to keep in mind when starting up with the GFX ETERNA 55 for the first time and trying not to be overwhelmed by the number of display options and controls you might otherwise have to figure out for yourself.

The Fujifilm GFX ETERNA 55

As we’ve covered since it was first announced, the Fujifilm GFX ETERNA 55 is an exciting new high-end large-format cinema camera that aims to bring Fujifilm’s vaunted photographic history to a video-centric form. Built around a new large-format sensor that covers Open Gate and a range of additional formats, the GFX ETERNA 55's array of features has been smartly designed for exceptional video capture and seamless integration with FUJIFILM G lenses.

If you’re curious to check out the Fujifilm GFX ETERNA 55 yourself, we’re happy to report that it’s out and available now, and it's one of the most intriguing new cameras of the year. Here are its full specs and purchase options.

Large-Format Sensor, Variable ND
4K48 4:3 Open Gate, 6.3K24 Super 35
14.5-Stop Dynamic Range, Dual-Native ISO
Internal ProRes 422 HQ, H.265, Proxies
Ext HDMI 10-Bit Uncompressed, 12-Bit Raw
5" 16:9, 3" 3:2 Touchscreen LCD Screens
20 Film Simulations, F-Log2/C 3D LUTs
FUJIFILM G and ARRI PL Lens Mounts
Ethernet, BLE with TG-BT1/ATOMOS AirGlu
CFexpress and SD Card Slots, USB-C Port

FUJIFILM GFX ETERNA 55 Cinema Camera
FUJIFILM brings its vaunted photographic history to a video-centric form with the GFX ETERNA 55 Cinema Camera.]]></description>
			<pubDate>Fri, 19 Dec 2025 20:00:03 UTC</pubDate>
			</item><item>
			<title>The More You KNOWLED: Here’s a Helpful Guide to Working With Godox’s New MG4K Light</title>
			<link>http://www.personal-view.com/news/articles/the-more-you-knowled-here’s-a-helpful-guide-to-working-with-godox’s-new-mg4k-light</link>
<description><![CDATA[Announced as a next-gen full-color COB light offering surprisingly bright output and remarkable color, the Godox KNOWLED MG4K has quickly become a popular option for video crews of all sizes and a well-tailored fit for modern filmmaking and videography trends.

Part of Godox’s KNOWLED lineup, the MG4K stands out with its efficiency and its ability to surpass many traditional lights in brightness and ease of use. If you haven’t used a Godox KNOWLED before, or if you’re simply curious how to get started with a light like the MG4K, the company has released a nice walkthrough tutorial that covers everything you need to know about the latest RGB LED monolight in Godox’s popular KNOWLED lineup.

Godox KNOWLED MG4K Walkthrough

Hosted by Davi Valente, this video tutorial is pretty straightforward but worth the watch—especially since it packs a lot of helpful information into an under-10-minute format. The video goes over what makes the Godox KNOWLED MG4K unique and how it can surpass traditional lights in brightness thanks to its exceptional efficiency.
It really comes down to the advanced light engine at the heart of the MG4K, which maximizes optical efficiency and outshines typical 4K HMI lights with an impressive 124,000 lux at 5 meters (with a 15° reflector).
The MG4K is also notably lightweight and easy to use: it weighs just 10kg and can operate on only 2000W instead of the usual 4000W, making it a nice option for those working with limited resources.

Price and Availability

We’ve covered the Godox KNOWLED MG4K in the past, and by Godox’s own testing, it’s been able to outperform traditional 4K HMI lights thanks to its high efficiency.
It’s also a nice expansion of the KNOWLED MG line, offering G-mount compatibility along with the line’s usual IP65-rated housing. The light also comes with a controller that is intentionally separate from the fixture to make overhead placement easier, plus a reliable, improved fan cooling system.
Overall, if you’re looking for a new option to explore, the MG4K is a well-balanced, cinema-quality LED designed to make it easy to maximize brightness and efficiency. Here are the full specs and purchase options.

Godox KNOWLED MG4K Bi-Color LED Monolight
Emerging as a smaller yet still effective point-source alternative to the MG6K, Godox brings the KNOWLED MG4K Bi-Color LED Monolight to the market with 2000W of power, advanced features, and various control and modifier options for high-end sets in need of quality image creation.]]></description>
			<pubDate>Mon, 15 Dec 2025 17:55:03 UTC</pubDate>
			</item><item>
			<title>How the &#039;Stranger Things&#039; Title Sequence Was Made—Straight From the Artist Who Created It</title>
			<link>http://www.personal-view.com/news/articles/how-the-stranger-things-title-sequence-was-made—straight-from-the-artist-who-created-it</link>
<description><![CDATA[Well, this is cool. Eric Demeusy, the filmmaker and motion effects artist, has just shared a complete (like, comprehensively complete) breakdown of his entire process of creating the original Stranger Things title sequence in Adobe After Effects.

The title sequence is probably the most iconic part of the Stranger Things franchise, which is currently the most popular piece of media in the world. And it was all created in a program that you can boot up and start working in quite easily today.

Let’s check out this tutorial, explore what his process was, and see how you too can create iconic and awesome title sequences and animations of your own.

The Original Stranger Things Title Sequence

Before we dive into Demeusy’s tutorial, let’s take a stroll down memory lane and look at the original Stranger Things title sequence, which was uploaded to YouTube when the show first debuted nine years ago. (Let that sink in for a moment…)

As you can see, the original title sequence is still as iconic and simple as it seemed then. Proof that you don’t need a million-dollar budget and a team of animators to create something cinematic and effective. And if you look at the most recent title sequence for season 5 of Stranger Things, it hasn’t changed much, as the series continues to use pretty much the exact same style and format Demeusy originally created.

Stranger Things Title Sequence Breakdown

Now, moving on to the fun stuff, here’s the full video breakdown uploaded by Eric Demeusy on his YouTube channel.
The video, which is nearly an hour long, goes in-depth, exploring his entire process of animating the title sequence back in 2016 inside Adobe After Effects. The full tutorial does a nice job of chopping itself up to cover some of the fundamental elements of the iconic title sequence, breaking down into the following sections:

The Main Title lock up
Close up letter form shots
Title fades
Overall film grain, flicker, and gate weave
Final compositing & finishing

The goal here, as stated by Demeusy, is to share as much advice and insight as possible to inspire other up-and-coming motion designers and filmmakers to unlock the tools they need for their craft. If you’re curious to follow along more thoroughly, here are some individual sections with timestamps to check out:

00:00 Intro
01:05 Setting up the Logo
02:19 Animation
09:16 Creating the Look
23:54 Close up Letters
41:40 Title Cards
45:10 Making it Gritty
51:53 Recap

What It Was Like Animating the Stranger Things Title Sequence

Since publishing the full tutorial breaking down his animation process for the original Stranger Things title sequence, Demeusy has followed it up with a companion piece that further explores his process with some different insights and pieces of advice. A bit shorter in length, this companion video is a great watch either before or after the longer version.
(It also might be an easier one to share with friends or colleagues who are interested in this process but likely don’t have an hour to watch the longer video.)

If you’re enjoying this video, you can jump into its different sections with the timestamps below.

00:00 Intro
00:23 The Origin
02:03 How I Started
04:03 Getting the Look Right
07:20 The Influence and Research
10:33 Retrospect, Purpose, and What We Created
13:56 Outro

Overall, it’s both an impressive feat and an inspiring story that showcases how even the simplest of prompts and projects can become major hits with huge cultural significance. So, as you go on your filmmaking and video editing journeys, keep your eyes out for helpful insights and inspiration from those who have advice to share on how they made their marks.]]></description>
			<pubDate>Mon, 15 Dec 2025 17:36:42 UTC</pubDate>
			</item><item>
			<title>Vertiscope Explained: How to Shoot Vertical Anamorphic on Any Mirrorless or PL-Mount Camera</title>
			<link>http://www.personal-view.com/news/articles/vertiscope-explained-how-to-shoot-vertical-anamorphic-on-any-mirrorless-or-plmount-camera</link>
<description><![CDATA[We know, this is a sacrilegious topic for most cinematographers, but love it or hate it, vertical format video is here. It’s popular, it’s social, and it can help pay the bills. Or maybe you’re just a psycho and you love shooting vertical—who knows!

Regardless of how you feel about the format and the concept of vertical anamorphic (aka Vertiscope), we have a really cool and helpful guide to share with y’all that covers everything you need to know about shooting Vertiscope on any mirrorless or PL-mount camera. So, flip your camera 90 degrees and dive in below.

How to Shoot Vertiscope

Shared with us by the team at Blazar, this new tutorial from the company’s YouTube channel, Blazar Anamorphic Lenses, goes over everything you need to know about shooting vertical anamorphic.
In the video, the team goes over everything you need to know to capture cinematic vertical frames using Blazar lenses, from setup and squeeze factors to framing, composition, and real-world examples.
Here’s the full breakdown of the topics covered in the video with timestamps:
0:10 Introducing Vertiscope with the Blazar Beetle
1:17 How to Desqueeze with the Blazar App
1:56 Unlocking Four New Aspect Ratios with the Blazar Beetle
2:42 How to Shoot Vertiscope on Any Mirrorless or PL-Mount Camera
3:34 Understanding Vertiscope Aspect Ratios
4:58 Using the Blazar Anamorphic Calculator
5:46 Solutions for On-Camera Anamorphic Desqueeze
6:45 How to Calculate Squeeze Factor for Monitoring Vertiscope
7:09 How to Desqueeze on a Hardware Level
7:43 How to Desqueeze in Post-Production
9:09 Final Thoughts: Why Shoot Vertiscope?
Overall, the video does a great job of explaining how Vertiscope works as well as addressing questions you might have about which squeeze factors to choose, how to frame and monitor anamorphic in a vertical workflow, and how to handle post-production desqueeze and mastering. 
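As a rough illustration of the squeeze-factor arithmetic the video walks through, here is our own hedged sketch, assuming the desqueeze stretch lands on the sensor’s long axis once the camera is rotated 90 degrees; the function name and numbers are illustrative, not Blazar’s:

```python
# Hypothetical sketch of Vertiscope aspect-ratio math: rotate the
# camera 90 degrees, then desqueeze along the (now vertical) long axis.
def vertiscope_aspect(sensor_w: float, sensor_h: float, squeeze: float) -> float:
    """Delivered width:height ratio of the vertical frame."""
    frame_w = sensor_h            # short side becomes the frame width
    frame_h = sensor_w * squeeze  # long side is stretched by the desqueeze
    return frame_w / frame_h

# A 16:9 sensor through a 1.33x anamorphic, shot vertically, lands
# near a 1:2.37 vertical frame (taller than plain 9:16).
print(f"{vertiscope_aspect(16, 9, 1.33):.3f}:1")
```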
You can find more info on Blazar’s anamorphic lenses, which range across PL, E, RF, and other mounts, as well as check out some of their other resources, like an Anamorphic Calculator on the company’s website here.]]></description>
			<pubDate>Mon, 08 Dec 2025 21:45:03 UTC</pubDate>
			</item><item>
			<title>Learn How to Pair Your Shinobi 7 RX and Atomos TX Together For Stable Wireless Monitoring</title>
			<link>http://www.personal-view.com/news/articles/learn-how-to-pair-your-shinobi-7-rx-and-atomos-tx-together-for-stable-wireless-monitoring</link>
<description><![CDATA[Honestly, at this point, if you’re not using wireless video monitoring on your sets, you’re kind of doing yourself a disservice. Sure, you don't have to have a wireless video monitor. You could use the monitor or touchscreen on your camera. But in terms of ease of use and affordability, it really doesn’t take much extra effort or budget to get all the perks of wireless monitoring.

If you’ve been on the fence, intimidated by the workflow, or if you’d simply like to check out some different options, Atomos has put out a nice tutorial video covering how to work with the Shinobi 7 RX and the Atomos TX. Let’s look at this tutorial and see how you can get started with these options today.

How to Monitor Wirelessly with the Shinobi 7 RX

Released on Atomos’ YouTube channel, this new video is all about pairing the Shinobi 7 RX with the Atomos TX to unlock stable wireless monitoring using built-in Wi-Fi and RX capabilities. The video tutorial goes over all the steps you need to follow with clear demonstrations and easy-to-follow commands.

It’s really not rocket science and not too hard to figure out on your own. However, for all of us visual learners, it does scratch an itch to see these tasks performed in front of you. (It actually has a weird ASMR vibe that is quite relaxing.) Overall, it’s a simple but efficient workflow that can unlock some workflow-improving functionality for your sets.

The Atomos Shinobi 7 RX

If you’re curious about getting started with the Shinobi 7 RX and using it for your projects and processes, here are some quick specs and purchase option info to check out.
7" 1920 x 1080 Touchscreen HDR Display
Wi-Fi Camera Control, Focus, Streaming
HD 3G-SDI & 4K60 HDMI 2.0 Input & Output
2200 cd/m² Brightness
Dual L-Series Battery Slots, USB-C PD In
HDMI/SDI Cross Conversion
AtomOS, 3D LUT Support, LANC Control
10-Stop Dynamic Range
Supports a Variety of Log Formats, Tools

Atomos Shinobi 7 RX HDMI/SDI HDR Monitor
Add a bright, high-resolution on-camera monitor to your live production kit for your focus puller, director, or camera assistant with the Shinobi 7 RX HDMI/SDI HDR Monitor from Atomos.]]></description>
			<pubDate>Fri, 05 Dec 2025 16:37:19 UTC</pubDate>
			</item><item>
			<title>How to pair the Atomos TX with the SHINOBI 7 RX for wireless monitoring</title>
			<link>http://www.personal-view.com/news/articles/how-to-pair-the-atomos-tx-with-the-shinobi-7-rx-for-wireless-monitoring</link>
<description><![CDATA[Atomos has released a tutorial video that shows you how to pair the Atomos TX with the SHINOBI 7 RX for wireless monitoring. Here are the steps:

Connect and power on the Atomos TX to your camera
Confirm that your video signal is displayed
Power on the SHINOBI 7 RX
Navigate to the wireless menu on the … Continued
The post How to pair the Atomos TX with the SHINOBI 7 RX for wireless monitoring appeared first on Newsshooter.]]></description>
			<pubDate>Fri, 05 Dec 2025 01:47:11 UTC</pubDate>
			</item><item>
			<title>Learn How to Remotely Control Broadcast Setups With the Servo Link</title>
			<link>http://www.personal-view.com/news/articles/learn-how-to-remotely-control-broadcast-setups-with-the-servo-link</link>
<description><![CDATA[If you’ve ever been curious about a workflow or a new tool, the honest-to-god best way to learn these days is usually just to go to YouTube and search for an answer. Yes, some of us are more text-minded, but for the majority of the film and video population, I’m willing to bet visual, hands-on demonstrations are easier to follow.

If you’ve ever been curious about remotely managing your broadcast setups with tools from Ignite Digi, then this is the in-depth tutorial for you. It covers everything you need to know about working with the Servo Link and how it translates Tilta FIZ commands into industry-standard broadcast servo and cine servo lens control. Here’s what you need to know.

Ignite Digi Servo Link Tutorial

Hosted on Ignite Digi’s channel, this very in-depth tutorial showcases pretty much everything you might ever want to know about the intricacies of the Servo Link and how it can be seamlessly integrated within the Control Ecosystem.
What’s neat about this workflow is how it allows Control Deck and FIZBUSTER operators to remotely control Canon and Fujinon servo lenses via the long-range DJI Transmission signal. This works by allowing the Servo Link to translate Tilta FIZ commands to industry-standard broadcast servo and cine servo lens control.
Here’s a full breakdown of what is covered in the tutorial above.
00:00 Intro
00:11 Why did we make Servo Link?
00:53 How does Servo Link work?
02:32 Hirose ports & cables
03:14 MōVI Pro rig breakdown
04:41 Ronin 2 rig breakdown
07:27 Control Deck
08:55 Explaining zoom scaling
10:44 Explaining focus curve
13:11 FIZ handoff & FIZBUSTER
15:31 Use alongside CCU
19:29 Explaining crash zoom
20:18 Servo Link off gimbal
21:49 Tilta Nucleus-M use cases
28:01 Tilta Advanced Ring Grip
29:20 eMotimo Conductor
29:48 NODO Torq Head
31:11 Thanks, wiki, & wrap up
If you’d like to learn more about the Servo Link and what it offers, you can check it out on Ignite Digi’s website here.]]></description>
			<pubDate>Wed, 03 Dec 2025 21:51:32 UTC</pubDate>
			</item><item>
			<title>Tilta Khronos 17 Lite Setup Guide &amp; Overview</title>
			<link>http://www.personal-view.com/news/articles/tilta-khronos-17-lite-setup-guide-overview</link>
			<description><![CDATA[Tilta has posted up a nice setup and overview guide for its Khronos 17 Lite iPhone cases. This in-depth tutorial covers how to connect the Khronos Lite Control Handle via Bluetooth, allowing you to control key functions inside both the iPhone 17 native camera app and the Blackmagic Camera app. The Tilta Khronos 17 Lite … Continued
The post Tilta Khronos 17 Lite Setup Guide & Overview appeared first on Newsshooter.]]></description>
			<pubDate>Fri, 28 Nov 2025 00:16:40 UTC</pubDate>
			</item><item>
			<title>DJI Osmo 360 Review and Hands-On – A Bikepacking Trip through Kyrgyzstan</title>
			<link>http://www.personal-view.com/news/articles/dji-osmo-360-review-and-handson-a-bikepacking-trip-through-kyrgyzstan</link>
			<description><![CDATA[This autumn, I fulfilled a huge dream of mine and went on a bikepacking trip into the wild nature of Kyrgyzstan. To say that it was an incredible adventure would be an understatement. Sometimes, it felt as unreal as if we had traveled to another planet. My husband and I are filmmakers, so we decided to film a short documentary along the way. Apart from a small mirrorless cam and a compact drone, we also took with us a DJI Osmo 360 and gave it a thorough test drive. How it survived our journey, what images we were able to capture with it, and what our genuine impressions of the camera and its post-production workflow were – read all about it in our detailed Osmo 360 review below!



The Osmo 360 is DJI’s first-ever 360 camera. Generally, the Osmo series is known for strong performance in action and vlogging, as well as professional documentary work. (As an example, watch Johnnie try out the DJI Osmo Pocket 3.) Yet entering a market segment that already has a well-established leader was a bold move. (I’m talking about Insta360, of course.) So, the company had to introduce some features that would make its product stand out. These include a large square-shaped image sensor, 8K 50fps recording, and claimed 13.5 stops of dynamic range. (If you want to read about all of the technical specifications, head over here to our dedicated article.)



DJI Osmo 360 review: our field trip conditions



On paper, the DJI Osmo 360’s capabilities sounded like the perfect match for our use case. Let me give you a quick overview of our trip conditions, and you’ll see why. During our journey, we spent a considerable amount of time in the mountains, with temperatures ranging from 0 °C (32 °F) to +24 °C (75 °F). Precipitation-wise, we mostly got lucky, though we did have to push through hail and snow on one of the steepest ascents.




Some photo impressions from our bikepacking trip. Images credit: Imanuel Thallinger



Our longest streak off-grid, with no access to civilization, electricity, or mobile reception, lasted for five consecutive days. The key from the start was simple: since we had to carry everything on our regular (not electric!) mountain bikes, it was essential to take as little as possible. We needed an action camera that was tough, adaptable, and light. The DJI Osmo 360, along with its accessories, seemed like the perfect fit. 



Our setup with the Osmo 360 for the tour



Our Osmo 360 setup for the tour included:




The Osmo 360 itself, with a soft pouch for carrying it when not in use.



Transparent lens protectors – we rarely used them. I preferred keeping the lenses uncovered in good, dry weather to avoid any loss of image quality.



Osmo Motorcycle Heavy-Duty Mount – mostly used to rig the camera on the handlebars, and it felt 100% secure.



Osmo Dual Heavy-Duty Clamp for further rigging options.



Osmo Hanging Neck Mount Max, which we used for all the point-of-view shots from the chest. (An important note: if you want to use this accessory with the Osmo 360, you will need an action cam quick-release adapter mount, which DJI also offers, but it is sold separately.)



Osmo Action Multifunctional Battery Case with additional batteries. Luckily, DJI made the Osmo 360 compatible with Action 5 batteries, and we had some spares in CineD’s storage. I’ll touch on battery life later, but without this case, we wouldn’t have been able to keep filming through the entire bikepacking trip.



Osmo 1.2m Invisible Selfie Stick with magnetic quick-release mount. The quick-release worked wonderfully, but the mount came loose and was lost during a long, bumpy downhill section, so I’d recommend removing it from the bike when not filming. 



DJI OM Grip Tripod – for all the stationary situations, like interviews, timelapses, and such.



2x Lexar SILVER PLUS 512 GB. Compatibility-wise, these microSD cards are among those specifically recommended by DJI. We managed to fill up both of them over 14 days (mostly filming in 8K 360 mode, 25fps, in D-Log M). The DJI Osmo 360 also has 105GB of built-in storage. We didn’t have to use it, but it’s great to know that you have a safety net, just in case.
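For anyone planning a similar off-grid trip, card capacity is easy to budget in advance. Here is a minimal back-of-the-envelope sketch in Python; the 130 Mbps figure is purely a placeholder assumption (the actual recording bitrate isn’t quoted in this review), so substitute the real bitrate of your chosen mode:

```python
# Back-of-the-envelope recording-time estimate for a memory card.
# The 130 Mbps bitrate below is a placeholder assumption, not a DJI spec --
# substitute the actual bitrate of your chosen recording mode.

def recording_hours(card_gb: float, bitrate_mbps: float) -> float:
    """Approximate hours of footage a card holds at a given average bitrate."""
    card_megabits = card_gb * 1000 * 8      # decimal GB (as marketed) -> megabits
    return card_megabits / bitrate_mbps / 3600

# Our two Lexar cards and the built-in storage, at the assumed bitrate:
for capacity_gb in (512, 105):
    print(f"{capacity_gb} GB at 130 Mbps: about {recording_hours(capacity_gb, 130):.1f} hours")
```

A 512 GB card at that assumed bitrate works out to well under a day of continuous recording, which is why we carried two.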




Image credit: Mascha Deikova / CineD



It looks like a lot, yet it actually made a pretty compact kit, which didn’t take up much space or add much weight compared to our camping gear and clothing. With this setup, we were able to cover all the shots and angles we wanted for our documentary, and almost everything worked out perfectly. (Below, we’ll also talk about what didn’t).



Rigging the DJI Osmo 360 on a bike



Naturally, most of our time on the trip was spent on the bikes, so we had to experiment with different filming angles and rigging setups. After some testing, we settled on three main configurations and rotated between them along the way.



The first one was fixing the Osmo 360 on the handlebars using DJI’s Motorcycle Heavy-Duty Mount. This setup captured all the landscape shots during the ride and turned out to be the best angle for filming my teammates from behind. Apart from that, it was the easiest camera position for deep talks or angry rants during the most challenging parts of our trip. (Or at least, that’s how I used it.)



This is what the rig looked like. Image credit: Imanuel Thallinger



To capture me (or other riders) in action in a wide view from above, we built a seatpost rig, using the Heavy-Duty Clamp, a selfie stick, and some lashing straps to increase stability. I do NOT recommend this setup. First off, the invisible selfie stick that we had with us is not made for environments with strong vibrations (and DJI officially recommends it for handheld use only), so it wobbled a lot. (According to DJI, for vehicle-mounted applications, you should use a High-Strength Carbon Fiber Invisible Selfie Stick.) Secondly, after some time on a bumpy road, the clamp tended to loosen, causing the stick and camera to tilt until they nearly touched the ground. Lastly, DJI offers a dedicated Bike Rear Mount Kit, which we couldn’t get before the trip, but would have been a better way to mount the Osmo 360 under the saddle.



Image credit: Imanuel Thallinger



Point-of-view shots



Our third go-to setup was the chest rig for downhill sequences or other point-of-view shots. To me, the flexible Hanging Neck Mount felt very comfy. My husband would disagree. So I guess you should try it on and move around in it before taking it on a long ride. On this rig, we always used single-lens mode because that was the only option that made sense. In retrospect, I wish I had had more reframing flexibility. Still, it’s generally a great angle for creating a visceral, immersive feeling. See for yourself:



User-friendly interface



As you are aware, the primary advantage of a 360-degree camera is that it captures everything around it. Meaning: You don’t have to point the cam in the correct direction straight away, and can reframe and export as many clips as you need afterwards. Normally, I’m a fan of intentional filmmaking, planning shots in advance, and framing each composition carefully, so I don’t have much experience with panoramic recording.



However, for this particular kind of documentary, it was a good choice. When so much action happens unexpectedly, you don’t always have time to set up a beautiful frame. Also, it works well in terms of editing. Just click the record button, and you have several angles to cut to, captured in one moment. Here, for example: 



Keeping in mind that I didn’t have much prior experience with 360 cameras, I must say that the Osmo 360 felt super user-friendly and easy to understand and explore. Its interface is self-explanatory, and the settings are flexible. If you forget which button to press to change the recording mode, there is a corresponding icon on the touchscreen that triggers the same function. I did watch a tutorial prior to our journey, but I would have understood how to set it up even without it.



Making a time-lapse. Image credit: Mascha Deikova / CineD



First impressions from the Osmo 360



My impression of the DJI Osmo 360 is that it’s remarkably robust. It handled every kind of weather, from freezing cold to scorching sun, as well as heavy dust and a few drops to the ground, all with only minor scratches on the coating. Both lenses look like new after the journey, even though, as I mentioned, we rarely used lens protectors.



On the other hand, I often wished for longer battery life. It did hold up to the standard promised by DJI (“up to 100 minutes of runtime in 8K 30fps”), but that was not always sufficient in our case. This was particularly evident during time-lapses: I would set one up to run for roughly one hour, and the battery would be almost empty afterward. When your only energy source is a small, foldable solar panel, you definitely think twice before switching the camera on.
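If you are planning your own off-grid power budget, the claimed runtime makes the math simple. A tiny sketch, assuming DJI’s advertised 100-minute figure (real-world runtime, especially in cold weather, will be lower):

```python
import math

# Battery budgeting from DJI's claimed "up to 100 minutes of runtime in 8K 30fps".
# Treat the result as optimistic: cold weather and time-lapses drain faster.

RATED_MINUTES_PER_CHARGE = 100  # DJI's advertised figure

def batteries_needed(planned_minutes: float,
                     rated_minutes: float = RATED_MINUTES_PER_CHARGE) -> int:
    """Minimum number of fully charged batteries for a day's planned recording."""
    return math.ceil(planned_minutes / rated_minutes)

# Example: a one-hour time-lapse plus 90 minutes of riding shots
print(batteries_needed(60 + 90))  # at least 2 charges
```

In practice we would round up further, since the rated figure is a best case.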



Go-to modes



One of the practical features of the DJI Osmo 360 is that it allows users to set up custom modes and switch between them whenever they like. Personalization is key when you want to film something specific, and it definitely saves a lot of time.



In our case, we filmed everything in D-Log M to allow for further grading flexibility, and went for 25 fps (to match recorded clips with other footage down the editing line). Mostly, I would switch between:




8K 360 mode, 25fps – for riding and action sequences;



single-lens 5K mode (16:9), set to super-wide field of view, 25fps – for POVs, nature, and static shots;



and time-lapse in 360 mode, also at 25fps.




Initially, I also manually set the shutter speed to 1/50 and defined the ISO limits for correct exposure, but I quickly changed these to automatic. As weather conditions were unpredictable, we were always on the move, and I often had to hit record spontaneously; it was a simpler solution than diving into precise settings each time. However, if you were to use the DJI Osmo 360 for more accurate work, you’d be able to manually set everything you need.
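The 1/50 shutter at 25 fps mentioned above is just the classic 180-degree shutter rule (shutter speed = 1 / (2 × frame rate)). A one-line helper, for illustration only (this is not a DJI API):

```python
from fractions import Fraction

def shutter_for_180_degrees(fps: int) -> Fraction:
    """180-degree shutter rule: shutter speed = 1 / (2 * frame rate)."""
    return Fraction(1, 2 * fps)

print(shutter_for_180_degrees(25))  # 1/50, matching the setting used on the trip
print(shutter_for_180_degrees(30))  # 1/60
```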



DJI Osmo 360: action-ready



Among the other things I enjoyed about the Osmo 360 is how quickly it reacts and jumps into action. My feeling: it takes a mere second to go from completely switched off to shooting, provided you don’t need to change the mode. Even when it’s shut down, you can push the record button, and a micro-moment later, it’s already up and running. This felt good because we encountered so many magical moments, wild animals, and unexpected scenes, and were able to capture them without hesitation.



Another feature I used often was voice control. When the camera was fixed in one position on the bike and I didn’t plan to move it, voice commands were very convenient. However, they didn’t always work reliably. It might have been my accent, as I’m not a native speaker, or perhaps it’s something that could be refined in a future firmware update.



Further features



What else? In my opinion, the Osmo 360’s stabilization and horizon-locking systems work very well. Sure, in 360 mode you can stabilize the footage however you like in post, but I didn’t have any issues with stabilization in single-lens mode either.



The DJI Osmo 360 is waterproof and can be used underwater (according to DJI, at depths of up to 10 meters). We didn’t dive that deep, but we were still able to capture some shots during the river crossings. However, it is not recommended to film in 360 in these kinds of scenarios without special housing; otherwise, the underwater light refraction might cause image distortion and stitching errors. Furthermore, some test reports warn against using this cam underwater without the housing altogether, as the lenses can get condensation inside. Well, we did it anyway. This is what a clip shot from a single lens without the housing looks like. (By no means ideal and clean, but it will work for a quick transition):



The Osmo 360’s dynamic range and video quality



Okay, this is not a lab test! So we won’t be able to scientifically verify or disprove the claimed 13.5 stops of dynamic range. What we can do, however, is show you the resulting clips, shot on the DJI Osmo 360 in different lighting scenarios, and let you judge.



Heads-up! All the shots below use the automatic “Color Recovery” feature in DJI Studio. I applied it to the D-Log M footage during post-production to convert it to Rec. 709. Here’s a video from the sunset swim at Son-Kul Lake:



The next shot was taken around midday in the shade, as we tried to escape the dazzling sun.



And here’s an interior shot from the airport, with a keyframe animation to create movement:



What do you think? To me, it looks decent. Maybe I just have very low expectations when it comes to 360 cameras, but this is definitely not the worst dynamic range I’ve seen so far.



D-Log M 10-bit flat color recording



As already mentioned, we recorded in the D-Log M 10-bit color profile whenever it was possible. (It isn’t available in Super Night mode or for time-lapses, for example.) For us, it was important to preserve the highest possible flexibility for grading, as the documentary will also include footage from other sources. All the shots shown so far were exported using the automatic Color Recovery tool in DJI Studio. Now let’s see how much further they can be taken with professional grading in DaVinci Resolve.



Here’s the same lake shot, exported in flat D-Log M profile. (Unfortunately, not as ProRes – I’ll talk about it down below):



And here’s how it looks after the first grading tests:



You’ve already seen the export of this shot with an automatically applied Rec709 LUT above, so it’s easy to compare them.



In the next example, I’ve cropped in on the 360 footage (which visibly degraded the quality), added some keyframes for slight motion, and exported the result applying the Color Recovery and automatic Noise Reduction features in DJI Studio:



Now, the same shot, but exported in D-Log M, and then graded in DaVinci Resolve:



I guess a professional eye will always notice the noise, blur, and other artifacts. Still, that’s fine. These shots will work for our documentary and can even hold up on a cinema screen. After all, when the story is engaging, most viewers won’t dwell on minor quality issues.



Super Night mode on the Osmo 360



The next clip was shot using Super Night mode, a preset provided on the Osmo 360 out of the box.



DJI claims that, thanks to its larger sensor, the Osmo 360 has pretty good “night vision,” capturing a higher amount of light. Naturally, this applies to the normal modes as well, so, in my opinion, you don’t always need to switch to Super Night, a preset that mainly helps minimize the amount of noise. Its disadvantage, though, is that it doesn’t allow the use of the D-Log M color mode we discussed earlier.



Built-in audio tested



The Osmo 360 has the OsmoAudio Direct Microphone Connection. This means you can connect the cam directly to up to two DJI microphone transmitters and use it as a receiver. So, if you already work within the DJI ecosystem and have a DJI Mic 2 or Mic Mini, good for you!



As we tried to be frugal with the items we took with us during the trip, we decided to rely on Osmo’s internal 360-degree audio recording. Here’s how the speech sounds with a river in the background and without any enhancement or post-processing whatsoever:



In a second clip, my teammates walk further away from the camera and talk to each other in the city of Bishkek. (Please, ignore the glitch in the video – I tried the “Intelligent Tracking” feature, and it didn’t work properly here.)



That’s impressive, considering the city noise and the distance from the camera – we could still clearly hear their dialogue. For our documentary-style recordings and live sound, the Osmo’s internal mic was sufficient. For interviews or more focused conversations, though, I’d definitely choose DJI’s lavalier mics.



Time-lapses, made by the DJI Osmo 360



Just a few notes on time-lapses. What’s definitely lacking, in my opinion, is the ability to record time-lapse footage in single-lens mode. For some reason, the cam doesn’t offer this option (at least, at the time of writing this review). If you only need one lens and one direction, it feels like wasted storage and battery to record in 360 mode.



Yet, of course, the possibility of making a time-lapse in 360 can inspire you to try out fun ideas, so I would definitely keep it as an option.



Hyperlapses are also available on the Osmo 360, for those who are fond of them.



Post-production in the DJI Studio



DJI Osmo 360 video files are saved either as MP4 (with HEVC compression) when filmed through a single lens, or as OSV, DJI’s panoramic video format. The latter requires post-production with DJI software – either the DJI Mimo mobile app or DJI Studio on your Mac or PC.
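Since a card dump ends up with a mix of both formats, a tiny sorting script can save time before ingest. This is a hypothetical helper of our own, not part of any DJI workflow, and the bucket names are made up:

```python
from pathlib import Path

def sort_osmo_clips(source: Path) -> dict[str, list[Path]]:
    """Split a card dump into clips an NLE reads directly (.MP4) and
    panoramic clips that must pass through DJI software first (.OSV)."""
    buckets: dict[str, list[Path]] = {"nle_ready": [], "needs_dji_studio": []}
    for clip in source.iterdir():
        if clip.suffix.lower() == ".mp4":
            buckets["nle_ready"].append(clip)
        elif clip.suffix.lower() == ".osv":
            buckets["needs_dji_studio"].append(clip)
    return buckets
```

Anything that lands in the second bucket goes through DJI Mimo or DJI Studio for reframing and export first.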



The smartphone app is very easy to navigate and also allows you to control the camera during the recording. I haven’t used it for post-production much because I prefer to view the clips on a larger screen. So, instead, most of the shots you’ve seen in this article were exported using DJI Studio.




Screenshots from DJI Studio, showing different features. Images credit: Mascha Deikova / CineD



I’m sure that, for most use cases, DJI Studio is a perfect fit. It’s quick, simple, and has all the functions one might need for basic edits. Yet I wasn’t happy with it at all and will try to avoid using it for our documentary post-production. Here’s why:




Only one timeline (with one video and one audio track) per project. Imagine importing 1,000 OSV clips that need to undergo selection and reframing before being exported for grading. Tackling all of that in a single timeline would be cumbersome, especially since you can’t set in- and out-points for the export. On top of that, the software lacks a batch export function for professional workflows.



When you drag one clip onto the timeline and reframe it, everything works perfectly. However, if you want to split it and then reframe the clip sections differently, DJI Studio has a weird glitch. (By the way, the same happens when you pull several clips onto one timeline.) All the adjustments that you make to the first clip seem to be completely deleted once you move on to the next one. And just like that, you’re back to square one. The only workaround I found is to place one keyframe at the beginning of each clip, locking the reframing in place. That’s not how editing is supposed to work, though. So, I really hope that this will be fixed in one of the later software updates.



My biggest issue with DJI Studio, however, is that it doesn’t allow users to export shots in high-quality codecs for further post-production, such as ProRes or DNxHR. Yes, you can preserve the flat D-Log M 10-bit color profile, disable automatic Noise Reduction, and set your export bitrate to up to 160 Mbps (click “Custom” for that), but it’s still not the same when the software spits out a compressed H.264 MP4.




The only other option we currently have is a free Adobe Premiere Pro plugin called DJI Reframe. Its advantage is that you can work directly with DJI OSV files on the timeline, allowing you to access the original video quality and use your familiar editing approach (for instance, pancake editing) as you proceed.



However, it is currently only available for Mac users. And to be honest, we’re planning on editing our bikepacking documentary in DaVinci Resolve. So, it would be great if DJI considered further NLE integrations. Otherwise, I’m afraid, we’d have to establish a complicated and painful workflow.
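Until then, one possible stopgap is to script the Resolve hand-off yourself: export from DJI Studio at the highest available bitrate, then batch-transcode the H.264 files to ProRes 422 HQ with ffmpeg before conforming in Resolve. A rough sketch (assumes ffmpeg is installed; the function names are ours). Note that transcoding cannot recover quality already lost to H.264 compression – it only makes the files friendlier to edit:

```python
import subprocess
from pathlib import Path

def prores_cmd(src: Path, dst: Path) -> list[str]:
    """Build an ffmpeg command that transcodes an H.264 export to ProRes 422 HQ."""
    return [
        "ffmpeg", "-i", str(src),
        "-c:v", "prores_ks", "-profile:v", "3",  # profile 3 = ProRes 422 HQ
        "-c:a", "pcm_s16le",                     # uncompressed audio for the edit
        str(dst),
    ]

def transcode_folder(folder: Path) -> None:
    """Convert every DJI Studio H.264 export in a folder to a ProRes .mov."""
    for clip in sorted(folder.glob("*.mp4")):
        subprocess.run(prores_cmd(clip, clip.with_suffix(".mov")), check=True)
```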



Hands-on with DJI Osmo 360: conclusion



So, here it is, our honest (and thorough) account of using the DJI Osmo 360 on a bikepacking adventure through Kyrgyzstan. I’ve highlighted the features we appreciated and pointed out where there’s still room for improvement.



Overall, we were happy with the camera. It feels like a great choice for our particular type of journey and documentary. Let’s hope that DJI will continue working on the Osmo 360 and, especially, on the DJI Studio software, and will also consider further integration with the major NLEs.



What about you? Have you already had the opportunity to test this cam in action? What were your impressions so far? Let us know in the comments! Additionally, if you have any remaining questions about our Osmo 360 review, please don’t hesitate to submit them below.



Feature image credit: Imanuel Thallinger.]]></description>
			<pubDate>Mon, 17 Nov 2025 14:59:34 UTC</pubDate>
			</item><item>
			<title>What Are the Top 50 Film Schools in 2025</title>
			<link>http://www.personal-view.com/news/articles/what-are-the-top-50-film-schools-in-2025</link>
			<description><![CDATA[Just this morning, The Wrap released its list of the top 50 film schools in the United States. Now, we're obviously a website built to help people who have never been to or can't afford to go to film school. But I think we should also keep track of which schools are considered good, so we're going to cover the list. Film school is not necessary to break into or succeed in Hollywood, but it is a good place to learn skills you can use on set, along with basic formatting and storytelling lessons. By the way, we try to make all those things available for free here as well. But what are the best film schools in the United States? Let's dive in.

The Quick Case For and Against Film School

For decades, the debate has been the same: Do you need to spend $250,000+ for a piece of paper that says you can make movies? The answer is "No," but sometimes school gives people the structure they need to succeed. But let's be totally real: these top-tier schools (NYU, USC, AFI) are less about the classes and more about the two things you can't get from a YouTube tutorial:

Access: Priceless, high-end equipment and sound stages.
Network: A built-in, curated network of future collaborators, agents, and studio execs.

These things are incredibly valuable! But they come at a cost that can be creatively crippling for years, if not decades, in the form of student loans -- hilariously, the only debt you can't get rid of if you declare bankruptcy. You can only get rid of it if you die!
For every success story, there are thousands of graduates working three jobs just to service their student loan debt, their RED camera package gathering dust. So look, this list is a great tool for publications and for schools to use in their marketing brochures, but your success won't be defined by which school you went to, but by the work you create and the community you build.

The Best Film Schools in America

I took the liberty of using The Wrap list and then cutting it down for brevity's sake. Check it out below.

50. San Francisco State University (San Francisco, CA): Champions "independent voices" and cross-disciplinary flexibility. Cons: High 31:1 student-faculty ratio and a low 27% graduation rate.
49. Rhode Island School of Design (Providence, RI): Strength in animation, with a new "Movement Lab." Alum RaMell Ross ('14) was an Oscar nominee.
48. Belmont University (Nashville, TN): New to the list after a $58M gift. Building top-tier facilities (Dolby Atmos) and leveraging its Nashville location.
47. University of Pennsylvania (Philadelphia, PA): Ivy League program with an intellectual/academic focus. Strong alumni (Dick Wolf) and access to Wharton.
46. University of California, Berkeley (Berkeley, CA): Top public university and a great value for CA residents ($17K tuition). Popular for transfer students.
45. Penn State University (State College, PA): New to the list. Offers a "small-school feel" (150 majors) with "big-school resources" (new soundstage, labs).
44. Morehouse College (Atlanta, GA): Top HBCU with a tiny program (35 majors), strong mentoring, and a mid-90s graduation rate.
43. American University (Washington, DC): "Blended" approach with a strong focus on nonfiction filmmaking due to its D.C. location.
42. The Los Angeles Film School (Los Angeles, CA): For-profit school in Hollywood with a hands-on focus. Cons: 70% of students are online, leading to high ratios and low graduation rates.
41. Stanford University (Palo Alto, CA): A very specific, fully-funded graduate program that only teaches documentary filmmaking.
40. University of Georgia (Athens, GA): "Boutique" grad program (36 students) with unique courses (crowdfunding) and TV writers' room workshops.
39. Full Sail University (Winter Park, FL): For-profit school whose "secret weapon" is its massive 800K+ sq. ft. of facilities, including a backlot and virtual production stage.
38. Wesleyan University (Middletown, CT): A liberal arts program focused on "cinematic vocabulary" over pre-professional skills. Impressive alumni (Michael Bay, Mike White).
37. Rutgers University (New Brunswick, NJ): Small program near new Netflix/Lionsgate studios, highlighted by its Documentary Film Lab led by an Oscar-winner.
36. Ithaca College (Ithaca, NY): Features an L.A. program and student-run TV channel. Notable: Alum Bob Iger. Cons: Lacks key courses in AI and editing.
35. Arizona State University (Tempe, AZ): The Sidney Poitier New American Film School focuses on "championing underserved voices." Large program with facilities in Tempe, Mesa, and L.A.
34. Pratt Institute (Brooklyn, NY): Art-and-design school where students make up to 20 films by graduation. Cons: Lower graduation rate (62%).
33. Ringling College of Art and Design (Sarasota, FL): Small, undergrad-only program where students work on ~60 films. Access to cutting-edge animation/VR tech. Cons: No L.A. program.
32. Rochester Institute of Technology (RIT) (Rochester, NY): Fuses tech and creativity ("maker" mentality) with a 52,000-sq-ft "MAGIC" complex.
31. Drexel University (Philadelphia, PA): Puts "cameras in students' hands from Day 1" with a strong co-op program (Amazon, Lionsgate) and L.A. semester.
30. California State University, Northridge (CSUN) (Northridge, CA): "The people's film school." Offers high value and leverages its diversity (majority Latino) for industry access.
29. School of Visual Arts (SVA) (New York, NY): For-profit school focused on "working filmmakers," with post-grad festival support and a New York Film Festival partnership.
28. Hofstra University (Hempstead, NY): Undergrad-focused program where students make films from year one. Notable: Alum Francis Ford Coppola.
27. Southern Methodist University (SMU) (Dallas, TX): Tiny undergrad program (25 majors) where students produce a full-length feature film every two years.
26. University of Colorado at Boulder (Boulder, CO): Keeps analog film alive, shooting on Super 8, 16mm, and 35mm. Prepping for Sundance's 2027 arrival.
25. Stony Brook University (Stony Brook and New York, NY): A "hidden gem" MFA program in Manhattan with faculty like producer Christine Vachon and 100% grad student retention.
24. New York Film Academy (NYFA) (New York, NY): For-profit school with global campuses and a practical, hands-on focus. Cons: Middling retention (78%) and graduation (62%).
23. ArtCenter College of Design (Pasadena, CA): Tiny program (120 students) with an excellent 4:1 student-faculty ratio and an 80% hands-on learning approach.
22. Biola University (La Mirada, CA): Christian school with a "four-screen" focus (film, TV, computer, mobile) and a new AI lab. A 55,000-sq-ft facility opens in 2026.
21. DePaul University (Chicago, IL): Uses Cinespace soundstages (near The Bear), offers an Alumni Incubator Fund, and provides full-ride MFA scholarships.
20. Johns Hopkins University (Baltimore, MD): "Boutique program" that pairs classes with crew work on real-world doc/corporate shoots. Meets 100% of financial need.
19. Syracuse University (Syracuse, NY): Offers multiple degree paths (BFA/MFA, BS/MA), a "Newhouse Startup Garage," and the Dick Clark L.A. Program. Notable: Alum Aaron Sorkin.
18. Columbia College Chicago (Chicago, IL): Undergrad-focused school with a new virtual production wall, AI classes, and L.A. semester. Cons: Low 44% graduation rate.
17. Florida State University (FSU) (Tallahassee, FL): Very affordable ($7K in-state) with 100% retention. A new MFA program allows students to pitch for a fully-funded feature. Notable: Alum Barry Jenkins.
16. Northwestern University (Evanston, IL): Super-selective (7% acceptance) liberal arts approach. Offers a management certificate with the Kellogg School of Management.
15. University of Miami (Coral Gables, FL): Updated labs and new partnerships with the medical school for VR content. High 90% retention rate.
14. Boston University (Boston, MA): Opened a new $3.5M producing facility and offers $100K in student film grants. Cons: Pricey ($67K tuition).
13. Savannah College of Art and Design (SCAD) (Savannah, GA): Boasts the largest university film studio complex in the U.S. (Savannah + Atlanta) and an in-house casting office.
12. California Institute of the Arts (CalArts) (Santa Clarita, CA): World-renowned for its animation program (Pixar/Disney alums). Received a grant for a new Chanel Center for Artists and Technology.
11. University of Arizona (Tucson, AZ): Hispanic-Serving Institution with low in-state tuition ($15K) and high financial aid (98%). Cons: Shaky 49% graduation rate.
10. Emerson College (Boston, MA): VMA department was just elevated to its own School. Offers a BFA in Comedic Arts and an L.A. campus. Notable: Alums "Daniels" (Kwan & Scheinert).
9. University of North Carolina School of the Arts (UNCSA) (Winston-Salem, NC): A stand-alone arts conservancy. A bargain for NC residents ($6.5K) with a low 8:1 student-faculty ratio.
8. The University of Texas at Austin (UT Austin) (Austin, TX): Home to Matthew McConaughey's "Script to Screen" class and the world-class Harry Ransom archives.
7. Columbia University (New York, NY): The only Ivy with an elite, grad-only film program. "The story school" features an excellent 3:1 student-faculty ratio.
6. University of California, Los Angeles (UCLA) (Los Angeles, CA): Fiercely competitive (1% undergrad acceptance) but affordable ($19K in-state). Features a low 3:1 student-faculty ratio.
5. Loyola Marymount University (LMU) (Los Angeles, CA): Launched new AI-focused courses ("Producing with AI," "Law and AI") and a film festival for AI-integrated work.
4. Chapman University (Orange, CA): Integrating AI and virtual production schoolwide, breaking ground on a $5M Innovation Hub, and has doubled student film support.
3. AFI Conservatory (Los Angeles, CA): Elite, grad-only conservatory with a "learn by doing" mantra. New AI workshops funded by an Amazon grant. Notable: Inaugural class included Malick and Lynch.
2. University of Southern California (USC) (Los Angeles, CA): A massive alumni network and a new $25M virtual production center. Boasts a 100% retention rate.
1. New York University (NYU) (New York, NY): Top-ranked. Opened a state-of-the-art virtual production center (named for Scorsese, funded by Lucas) and promises zero tuition for families earning under $100K.

Summing It All Up

Again, these schools will not make or break your career, but if you're hellbent on getting a degree, these are the places worth looking at. And if there's stuff you want to learn, reach out to us and we'll try to write more articles on it. Let me know what you think in the comments.]]></description>
			<pubDate>Fri, 31 Oct 2025 23:33:13 UTC</pubDate>
			</item><item>
			<title>What Is This New TX Mode Now Available for Your CineView M7 Series Production Monitors?</title>
			<link>http://www.personal-view.com/news/articles/what-is-this-new-tx-mode-now-available-for-your-cineview-m7-series-production-monitors</link>
			<description><![CDATA[Available via a new firmware update for the Accsoon CineView M7 Pro, a new TX transmitter mode is here, along with several other features coming to the CineView M7 series. This is exciting news for production crews big and small: the mode uses the M7 Pro’s built-in Dual-Band wireless video module to transmit seamlessly to CineView Master 4K receivers, M7 Pros in RX mode, and other smart devices monitoring with the Accsoon SEE app. Let’s go over all of the new features and walk through how to get up and running with TX transmitter mode on your CineView M7 Pro production monitor.
Accsoon CineView M7 Pro Firmware Update
Available on Accsoon’s website, the latest firmware package brings not just the new TX mode but several other updates and features to the CineView M7 series. Here’s the full list of new features:
TX Mode Support for CineView M7 Pro
Wireless Camera Control Support
Vertical Anamorphic Support
Advanced Blanking Support
Unclipped Zoom-in Support
Custom Background Image Support
You can download the new firmware on Accsoon&#039;s website, or check out all of the details in the Release Note.
Enabling TX Mode
As for enabling the new TX mode on the CineView M7 Pro specifically, Accsoon has released a detailed tutorial that guides you through the process.
The video showcases how TX mode enables wireless video transmission using the M7 Pro&#039;s built-in Dual-Band wireless video module for seamless transmission to CineView Master 4K receivers, M7 Pros in RX mode, and/or other smart devices monitoring with the Accsoon SEE App.
You can find more info on this new feature and all the rest of the firmware updates on the same links above.
Accsoon CineView M7 Pro 7&quot; Recording Monitor with Wireless Transmitter/Receiver]]></description>
			<pubDate>Tue, 21 Oct 2025 17:47:36 UTC</pubDate>
			</item><item>
			<title>Accsoon CineView M7 Pro Firmware Update Adds TX Mode, Wireless Camera Control, and Major Monitoring Upgrades for Both Models</title>
			<link>http://www.personal-view.com/news/articles/accsoon-cineview-m7-pro-firmware-update-adds-tx-mode-and-wireless-camera-control-major-monitoring-upgrades-for-both-models</link>
			<description><![CDATA[Accsoon has released a significant firmware update for its CineView M7 and M7 Pro smart monitors, introducing a new transmission (TX) mode (Pro model only), enhanced wireless camera control, and a host of on-set monitoring refinements.



The update, available now for both the CineView M7 and CineView M7 Pro, expands the functionality of Accsoon’s growing CineView ecosystem by turning the M7 Pro into a powerful transmitter and improving integration with mirrorless and cinema cameras.



The firmware also adds new anamorphic options, refined blanking tools, and other quality-of-life improvements aimed at camera crews and DITs who rely on wireless monitoring systems in fast-paced production environments.



Turning the CineView M7 Pro into a transmitter



The key feature of the new firmware is the addition of TX Mode, which transforms the CineView M7 Pro into a transmitter. From the Mode settings menu, users can reboot the unit into a transmission configuration and choose between H.265 and H.264, depending on the capabilities of connected receivers.



In this mode, the M7 Pro can transmit to up to four receivers simultaneously, including other M7 Pro units in RX mode, CineView Master 4K receivers, and mobile devices running the Accsoon SEE app. H.264 must be used when transmitting to iOS devices, while both H.264 and H.265 are supported by the dedicated receivers.
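The iOS caveat amounts to a lowest-common-denominator rule for mixed receiver setups. A minimal sketch of that codec choice (the helper and receiver labels are hypothetical, not Accsoon's firmware logic):

```python
# Illustrative sketch only: models the receiver-compatibility rule described
# above, not Accsoon's actual firmware. Receiver labels are hypothetical.

MAX_RECEIVERS = 4  # the M7 Pro transmits to at most four receivers at once

def choose_codec(receivers):
    """Pick a TX codec that every connected receiver can decode."""
    if len(receivers) > MAX_RECEIVERS:
        raise ValueError("TX mode supports at most four receivers")
    if "ios_see_app" in receivers:
        # iOS devices running the SEE app only accept H.264
        return "H.264"
    # dedicated receivers (Master 4K, M7 Pro in RX mode) also decode H.265
    return "H.265"

print(choose_codec(["master_4k", "m7_pro_rx"]))    # H.265
print(choose_codec(["master_4k", "ios_see_app"]))  # H.264
```

In short: one iOS device in the chain pulls the whole transmission down to H.264.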



This new functionality effectively makes the CineView M7 Pro a flexible node within Accsoon’s wireless video ecosystem, enabling mixed setups of monitors, receivers, and smart devices across a production.



Accsoon CineView M7 Pro mounted on a camera. Image credit: Accsoon



Smarter wireless camera control for M7 and M7 Pro



Accsoon has also upgraded its wireless camera control system. Users can now connect to supported cameras via Wi-Fi, either through the camera’s own hotspot or an on-set network.



The new firmware adds support for the Sony FX3 and FX30, with FX6 compatibility currently in beta (validated with camera firmware v3.0).



A new Camera Control Method switch lets operators manually select between “USB” and “IP” connections. When using IP control, credentials can be entered directly within the new Remote Connection Panel, and for cameras like the FX3 and FX30, the app can now automatically detect and connect to camera hotspots.



Supported functions include remote record triggering, aperture, shutter speed, ISO or gain control, and white balance adjustments.
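To make the USB-vs-IP decision flow above concrete, here is a rough model of a control setup; every class name, field, and return string is an illustrative assumption, not Accsoon's actual API:

```python
# Hypothetical sketch of the connection flow described above; the class,
# fields, and return strings are illustrative, not Accsoon's actual API.
from dataclasses import dataclass

@dataclass
class CameraControlConfig:
    method: str         # the Camera Control Method switch: "USB" or "IP"
    camera_model: str
    ssid: str = ""      # IP-control credentials, as entered in the
    password: str = ""  # Remote Connection Panel

def connect(cfg: CameraControlConfig) -> str:
    if cfg.method == "USB":
        return f"USB control link to {cfg.camera_model}"
    # FX3/FX30 hotspots can be auto-detected when no network is given;
    # otherwise the monitor joins the specified on-set network.
    if cfg.camera_model in ("Sony FX3", "Sony FX30") and not cfg.ssid:
        return f"auto-detected hotspot for {cfg.camera_model}"
    return f"IP control to {cfg.camera_model} via {cfg.ssid}"

print(connect(CameraControlConfig("IP", "Sony FX3")))
# auto-detected hotspot for Sony FX3
```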



Accsoon CineView M7 Pro used as a transmitter (TX) for other receivers. Image credit: Accsoon



Monitoring upgrades and workflow refinements



The firmware also introduces a long list of monitoring and usability improvements for both models:




Marker tool renamed to Blanking: adds a Blanking Safezone function that masks overlays from the camera UI, with safe zones and guides now scaling dynamically.



Vertical anamorphic de-squeeze: supports vertical-format anamorphic shooting with automatic correction.



Unclipped zoom-in: the entire live-view canvas scales smoothly when zooming, even with anamorphic de-squeeze enabled.



Custom SEE app background: users can now upload their own 16:9 image for branding client monitors.



Updated aspect ratios: new common delivery presets and support for custom ratios to two decimal places.
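Two of the items above, vertical de-squeeze and two-decimal aspect ratios, boil down to simple arithmetic. A generic illustration (the math is standard anamorphic practice, not Accsoon's implementation):

```python
# Generic anamorphic math for illustration; not Accsoon's implementation.

def desqueeze(width, height, squeeze, vertical=False):
    """Return the corrected display size for anamorphic footage.

    Standard anamorphic stretches the width on playback; vertical-format
    anamorphic stretches the height instead.
    """
    if vertical:
        return (width, round(height * squeeze))
    return (round(width * squeeze), height)

# 2x anamorphic recorded at 3840x2160 de-squeezes to a 3.56:1 frame:
w, h = desqueeze(3840, 2160, 2.0)
print(w, h, round(w / h, 2))  # 7680 2160 3.56

# A 1.33x vertical anamorphic frame recorded at 1080x1920:
w, h = desqueeze(1080, 1920, 1.33, vertical=True)
print(w, h)  # 1080 2554
```

The `round(w / h, 2)` at the end is the same idea as the monitor's new custom ratios to two decimal places.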




Other improvements include faster startup, improved stability and latency, refined UI elements, enhanced accuracy for monitoring tools, and better audio decoding and reconnection behavior on the M7 Pro.



Check out our NAB interview with Accsoon about the monitors in case you missed it.



Availability of the CineView M7 Firmware Update



The firmware update for the Accsoon CineView M7 and M7 Pro is available now from the Accsoon Download Center. Tutorial videos explaining TX Mode and the new camera control features are also available on the company’s official channels.



With the addition of TX mode, smarter camera control, and refined monitoring tools, the CineView M7 Pro becomes even more versatile for professional sets. Have you already tried the new CineView M7 firmware update in your workflow? Share your experience in the comments below.]]></description>
			<pubDate>Tue, 21 Oct 2025 14:14:03 UTC</pubDate>
			</item></channel> 
	                </rss>