cameras that record in CinemaDNG produce files with bitrates 6-20 times the maximum DCP standard used in cinemas. However, real-time playback of such files requires computers with specs well beyond typical cinema servers, and beyond what is affordable for most people at the moment. Most people don't buy higher-TDP laptops nowadays either, and few buy laptops with Quadros or Radeon Pros, which would be needed for 10-bit color playback or for theoretical RGBCYM colorspaces fitting within those 30 bits (maybe there is extra room in the alpha channel(s); am not an expert in this).
while i certainly don't mind producing great footage and films for a small number of computers and downressing for more-mainstream devices, i feel another solution exists.
if we sequence the pixel values of completely-uncompressed frames such that we can copy straight to the framebuffer, wouldn't we only be limited by the SSD read speeds and interface (including interface traffic on mobo controllers)? I have thought about loading video clips into RAM for ultra-high-bitrate video but that feels like more of a (potentially useful) magic trick rather than something pragmatic for most film/video production and distribution. The financial costs of distributing films/videos via SSD are not trivial, yet I feel this leap in image fidelity may be important for us getting naturalistic art noticed amidst a sea of commercial films. I say this because great films do appear periodically and don't typically penetrate mainstream consciousness, despite their virtues and transcendental accomplishments. They certainly have profound ripple-effects, yet I feel it might be good to have some forthcoming films make obvious and pronounced waves partially by leapfrogging all the big-budget stuff on a technical level (technical in terms of audiovisual fidelity, anyway).
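a minimal C++ sketch of this no-decode playback idea, to make the proposal concrete. a heap buffer stands in for the real framebuffer here (on Linux one would mmap /dev/fb0 instead), and the frame geometry struct is made up for illustration:

```cpp
// Toy sketch of "no-decode" playback: frames are stored on disk as raw,
// display-ready pixel data, so playback is just a read plus a memcpy.
// A plain byte buffer stands in for the real framebuffer; in practice
// you would mmap /dev/fb0 (or use DRM/KMS) and throughput would be
// bounded by storage and bus speeds, as discussed above.
#include <cassert>
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <vector>

struct FrameGeometry {
    int width, height, bytes_per_pixel;
    size_t frame_bytes() const {
        return size_t(width) * height * bytes_per_pixel;
    }
};

// Copy the next frame from an already-open file straight into the
// framebuffer. Returns false on EOF or a short read.
bool blit_next_frame(FILE* f, const FrameGeometry& g, uint8_t* framebuffer) {
    std::vector<uint8_t> frame(g.frame_bytes());
    size_t got = fread(frame.data(), 1, frame.size(), f);
    if (got < frame.size()) return false;             // no more full frames
    memcpy(framebuffer, frame.data(), frame.size());  // the only "decode" step
    return true;
}
```

the whole point of the format is that `blit_next_frame` contains no decompression at all; the frame bytes on disk are already in the display's pixel layout.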
are there resources to consult to program something like this? i tend toward C++. alternatively, is anyone here interested in designing a codec or codec set based on the principle of minimal-to-no-decoding? a tool for transcoding and debayering (ideally for many display subpixel layout types) from CinemaDNG would be most helpful for low-budget filmmakers, with transcoding from the open ARRI raw format being potentially desirable for ~600 MBps alexa 65 footage (i don't recall the precise bitrate). personally i don't feel i will often film beyond the 272 MBps of the blackmagic pocket cinema camera 4k, but perhaps we and/or others will imagine cases that call for an alexa 65 or future cameras from other manufacturers. one of the out-of-production kinefinity cameras had a bitrate in the 300s (i think it was the first-gen Terra) but it hardly seems worth tracking down for a modest bump in bitrate; for some that camera is perhaps a dream camera though and it is worth looking into. one of the DJI cameras records a similar bitrate yet i don't know much about filming with it and whether it has a practical form factor when not mounted to a drone. anyway BMPCC cameras have been $1k and a bit more kitted with good memory cards and batteries and such, and the first BMPCC once sold for $500. we can get a lot of amazing footage from people who would not have been able to afford cinematography in the past. even iPods film above the 30 MBps DCP standard so we can totally start projects that will have a baseline fidelity above that of other contemporaneous cinema.
thank you for reading and for your interest and consideration
>cinema DNG produce files with bitrates 6-20 times the max DCP
You can't compare the two. One is demosaiced, the other is not.
>beyond typical cinema server specs
Not true. You can do demosaicing in real time using simple algorithms (such as bilinear) even on a low-spec CPU. Real-time playback of raw footage with high-quality demosaicing like AMaZE requires more powerful hardware, but it is possible too.
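for illustration, here is a toy demosaic even cruder than bilinear: collapse each 2x2 cell of the Bayer mosaic into one RGB pixel, averaging the two greens. this halves the resolution but needs almost no computation, which is why raw preview on weak CPUs is feasible. the RGGB layout and 8-bit samples are simplifying assumptions; real raw files use 10-16-bit samples and varying layouts:

```cpp
// "Superpixel" demosaic: each RGGB 2x2 cell of a Bayer mosaic becomes one
// RGB output pixel. Half resolution, near-zero cost; bilinear interpolation
// (full resolution) is the next step up in quality.
#include <cassert>
#include <cstdint>
#include <vector>

struct RGB { uint8_t r, g, b; };

// `mosaic` holds width x height raw Bayer samples, row-major, RGGB pattern:
//   R G
//   G B
// Output is (width/2) x (height/2) RGB pixels.
std::vector<RGB> superpixel_debayer(const std::vector<uint8_t>& mosaic,
                                    int width, int height) {
    std::vector<RGB> out;
    out.reserve((width / 2) * (height / 2));
    for (int y = 0; y + 1 < height; y += 2) {
        for (int x = 0; x + 1 < width; x += 2) {
            uint8_t r  = mosaic[y * width + x];
            uint8_t g1 = mosaic[y * width + x + 1];
            uint8_t g2 = mosaic[(y + 1) * width + x];
            uint8_t b  = mosaic[(y + 1) * width + x + 1];
            out.push_back({r, uint8_t((g1 + g2) / 2), b});
        }
    }
    return out;
}
```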
>fitting within those 30 bits
This doesn't make sense. Cameras usually record at a maximum of 16 bits per channel. More than that is only used for 3D graphics (under OpenEXR lossless compression) or for HDR in photography (such as with HDRMerge).
>copy straight to the framebuffer
Won't work, for two reasons:
1 - The raw footage needs to be debayered first, and you wouldn't distribute files as raw anyway, because you need to color grade them first. Most of them are in logarithmic spaces and need to be converted to Rec.709 or Rec.2020 first. You do that in post-production, after debayering.
2 - Even if you distributed uncompressed YUV files, they would have extremely large file sizes. If you tried to copy directly to the framebuffer, your bus bandwidth wouldn't be enough to move all that information in real time. It is also very inefficient, because you could use lossless compression (with CineForm or Lagarith) or near-lossless (with ProRes or DNxHR) instead.
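to put numbers on point 2, assuming 8-bit 4:2:0 YUV as a baseline (1.5 bytes per pixel: luma at full resolution, each chroma plane at quarter resolution), a two-hour 1080p24 film comes out to roughly 537 GB uncompressed:

```cpp
// Back-of-envelope size of an uncompressed 8-bit 4:2:0 YUV film.
// 4:2:0 stores 1.5 bytes per pixel: Y full-res, U and V quarter-res.
#include <cassert>
#include <cstdint>

// Bytes in one 8-bit 4:2:0 frame.
uint64_t yuv420_frame_bytes(uint64_t w, uint64_t h) {
    return w * h * 3 / 2;
}

// Total bytes for a clip at `fps` frames per second lasting `seconds`.
uint64_t yuv420_clip_bytes(uint64_t w, uint64_t h,
                           uint64_t fps, uint64_t seconds) {
    return yuv420_frame_bytes(w, h) * fps * seconds;
}
```

that is about 537 GB for two hours before any compression at all, which is roughly ten times a 50 GB Blu-ray, and that is why even "visually lossless" intermediates compress.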
>only be limited by the SSD
No, you would be limited by PCIe bandwidth (roughly ~6 GB/s), and even that wouldn't be enough.
>I have thought about loading video clips into RAM
You can do that using ffmpeg. Transcode any file to raw YUV with ffmpeg, then play it back with mpv (raw video has no header, so you have to tell mpv the dimensions and pixel format):
$ ffmpeg -i input.mp4 -f rawvideo -pix_fmt yuv420p file.yuv
$ mpv --vo=drm --demuxer=rawvideo --demuxer-rawvideo-w=1920 --demuxer-rawvideo-h=1080 --demuxer-rawvideo-mp-format=yuv420p file.yuv
>I feel this leap in image fidelity may be important for us getting naturalistic art
Video compression is not the variable limiting video quality. Codecs such as VP9, HEVC and even the new AV1 are already very good at compressing. Using the 50 GB of a Blu-ray you can fit a 2-hour movie without noticeable artifacts (for most people, at least, not professionals). To be fair, the cinema industry doesn't yet know what it can do to make it even better. Displays need to support bigger colorspaces (hence the new Rec.2020 standard) and more bit depth (hence 10-12-bit HDR), but from the camera and post-production perspective there's not much room to improve. One leap cinema did make these last years was the introduction of ACES color management and large-format cameras (such as the ALEXA LF and Panavision DXL2).
>are there resources to consult to program something like this?
No need, your idea is already implemented; it's just not used because it is ineffective. If you want compression resources, see xiph.org and the encode.ru forum.
>principle of minimal-to-no-decoding
You can't do that efficiently. If you want something efficient, see the AV1 codec:
https://en.wikipedia.org/wiki/AV1
>would be most helpful for low-budget filmmakers
Nope. The most helpful thing would be to make low-budget cameras actually record raw footage, as the BMPCC and Red do.
>iPods film above the 30 MBps DCP
You must be trolling here.
>true that one is demosaiced and the other isn't; however, if we have various subpixel-layout matrices for real-time debayering, then the raw file has higher fidelity, as it hasn't first been transcoded to a standard format, with the assumption that different displays (and things like diagonal-pixel projectors) will resolve the colors of that microregion of the camera sensor at the pixel/subpixel level
>Fast CinemaDNG Processor's recommended specs are an Intel Core i7 5930 or a Xeon with 6 cores or more, plus a Quadro M5000 (if 10-bit color in OpenGL is needed). They don't specify whether this is for the lower-bitrate 1080p 60fps BMPCC CinemaDNG footage or for CinemaDNG raw footage from newer cameras, nor whether it covers the 260-272 MBps files from the BMPCC 4k. You may be totally right about not needing high specs for playback; i just don't have a computer, this program, and files to test with at the moment.
>10 bits for R, G, and B each is 30 bits total. I don't know enough of the engineering here to say whether subdividing those for a 5-bit-per-channel RGBCYM color space is feasible, but that is something we've pondered.
>copy straight to the framebuffer
>i assume the raw files have some level of decoding necessary, because the bitrates are below what 1920x1080 pixels x 36 bits per pixel (with 12-bit color) totals in MB. this would be an intermediary codec that has been losslessly decoded and pre-debayered for a specific projector model, monitor type, tablet, mobile, et cetera, with different debayering algos specific to each subpixel arrangement
file size would not be too unreasonable. direct 1080p 60fps 12-bit-per-channel readout, which can be quadrupled with almost no computational overhead for 4k displays, would be 534 MBps. That's 31.3 GB per minute, so roughly two hours would fit on a 4 TB SSD. For longer films a dual-bay or quad-bay SSD dock would suffice; one can hotswap with limited bays if necessary rather than spring for CPUs and mobos with surplus SATA lanes and no PCIe lane conflicts.
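the arithmetic here can be checked mechanically (treating MBps and GB as binary MiB and GiB, as the figures in this thread seem to): 1080p at 60fps with 36 bits per pixel works out to ~534 MiB/s and ~31.3 GiB per minute; whether ~130 minutes fits on a "4 TB" SSD then depends on decimal versus binary terabytes:

```cpp
// Uncompressed RGB bitrate and storage math for the figures above.
#include <cassert>
#include <cstdint>

// Bytes per frame at `bits_per_pixel` total (e.g. 36 for 12-bit RGB).
uint64_t frame_bytes(uint64_t w, uint64_t h, uint64_t bits_per_pixel) {
    return w * h * bits_per_pixel / 8;
}

// Playback bandwidth in bytes per second.
uint64_t bytes_per_second(uint64_t w, uint64_t h, uint64_t bpp, uint64_t fps) {
    return frame_bytes(w, h, bpp) * fps;
}

// Whole minutes of footage fitting on a drive of `drive_bytes`.
uint64_t minutes_on_drive(uint64_t drive_bytes, uint64_t bytes_per_sec) {
    return drive_bytes / (bytes_per_sec * 60);
}
```

a marketed (decimal) 4 TB drive holds about 119 minutes at this rate, while a binary 4 TiB would hold about 130; the "130 minutes on a 4 TB SSD" figure assumes the binary reading.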
anyway, PCIe speed is more than enough to cover the fidelity coming out of these cameras.
>I have thought about loading video clips into RAM
You can do that using ffmpeg. Transcode any file to raw YUV with ffmpeg, then play it back with mpv (raw video has no header, so you have to tell mpv the dimensions and pixel format):
$ ffmpeg -i input.mp4 -f rawvideo -pix_fmt yuv420p file.yuv
$ mpv --vo=drm --demuxer=rawvideo --demuxer-rawvideo-w=1920 --demuxer-rawvideo-h=1080 --demuxer-rawvideo-mp-format=yuv420p file.yuv
Thanks for the great tip! ^_^
>Video compression is not the variable limiting video quality.
To some extent I agree, yet I also disagree. Improper camera exposure settings for the scene, excessive post-processing, excessive transcoding and resolution changes during post-production, and iffy display tech all affect image quality. However, I feel footage handled in the way described above, with no post-processing or minimal color grading (which could be done in real time, to adjust for specific projectors and screens on cinema servers, if specific theatres wished to invest in $5,000+ servers), would yield a pronounced leap in fidelity and feel transportive, almost portal-like. It is speculative, yet i am trying to test it this year or next and can report back the results.
>The most helpful would be to make low-budget cameras actual record Raw footage
Agreed, yet looking at the bitrates for the given sensor resolutions, no camera (not Arri, Sony, Panavision, Red, Canon, Blackmagic, Kinefinity, Ikonoskop, DJI, Digital Bolex, et cetera) is recording bit-perfect streams from the subpixels. All the raw codecs are marketing. Personally I feel all the cheaper ones recording CinemaDNG (not the newer Blackmagic Raw 3:1 and 4:1 and such) are more than good enough to leapfrog the big-budget film production companies with.
>iPods film above the 30 MBps DCP
It's true. Raw! Video Pro Film Camera records 1080p MJPEG at 300 Mbps / 38 MBps. If the exposure is good in-camera, it might look higher-fidelity than anything currently being projected in cinemas. i have a 128 GB iPod touch 6th gen that i will be testing with this week. unsure if the iPod plays it back without stuttering; i don't have a modern laptop or desktop to test playback with if that is the case, but will try to test this soon and can let you know if it works well. if it is dropping frames i will save up and buy a 256 GB iPod touch 7th gen, which can store like 90 minutes of footage with like 50 GB of space still free for apps, music, photos, lower-bitrate videos, games, et cetera.
Of course, manual focusing, lens selection, and dynamic aperture control are all a bit problematic, but for some things i feel it is a fine camera, and for very fast, agile movement it is maybe the best. feels good in hand and in an armstrap, which can be mounted elsewhere on the body or on various sport harnesses. Am eager to be filming with BMPCCs and BMMCCs yet only have iPods with me at the moment; if the results are good i feel they will be very nice options, especially for people from low-income countries, assuming editing and distributing the footage will not require too much additional tech and funding. am unaware of any mobile apps that export VBR at the original bitrate or offer something like 38 MBps MJPEG export or 30 MBps DCP export. Will try DCP-o-matic on an Atom processor soon.
ultrakam for iOS films at ~50 MBps. some say it is stable and doesn't drop frames, others report it crashing. i just downloaded it on an iPod touch 6th generation; i have almost all the background stuff iOS does by default turned off in the settings. it is on the factory iOS 10.0.1 and i don't think i install sketchy apps. will test it soon and post results with the other tests within a month or two
>>4839 I think you are a bot, but I'll reply anyway because other people might read and find interesting:
>no post-processing or minimal color-grading (which could be done real-time
You can't do that. To be able to color grade you need to debayer first. Also, color grading requires precision; you can't do that in real time. And you wouldn't be able to do visual effects (CG).
>would yield a pronounced leap in fidelity
It wouldn't. You don't understand the concepts you're talking about. You can achieve the same "fidelity" using a lossy codec like ProRes, yet with much smaller file sizes.
>no camera [...] are recording bit-perfect streams from the subpixels.
What do you even mean by "bit-perfect"? Please go read how a Bayer sensor works.
>All the raw codecs are marketing.
No, a raw codec is just a lossless compression algorithm optimized for Bayer data.
>good enough to leapfrog the big-budget film production companies with.
It is not, because of the sensor size. This means current big-budget lenses don't work on them, since most of those are for full-frame (35mm) or large-format (65mm). The BMPCC is a nice camera, but you can't compare it with something like the ALEXA LF.
>Raw! Video Pro Film Camera records 1080p MJPEG
MJPEG is not Raw. Please, again, go read about what a Bayer sensor and lossless compression means.
>color grading and debayering can be done together. a LUT isn't going to losslessly grade 36-bit RGB, though. the goal is to get the image exposure and colors right in camera as often as possible, and only color grade in ways that mathematically preserve the original sensor data insofar as possible. real-time color adjustments would be for tailoring the image to specific projectors and screens at commercial theatres and establishments that can afford high-end computers.
>i don't agree that ProRes maintains that fidelity. maybe on a small screen, not on a large-FOV projector or a VR display. i sometimes sit up close at theatres to assess the image fidelity, and walk up close to 4K TV demo reels and such. the image breaks down well before the pixel grid becomes visible.
>a bayer sensor is 2/3 the marketed resolution. for '1080p' 60fps 12-bit, an uncompressed 60 frames would be 534 MB; 2/3 of that is 356 MB. the BMPCC films at like 180 MBps, give or take a few. All the professional cinema cameras are like this; none that i am aware of record full bit-perfect readouts of the marketed specs. Not that it matters much: there is so much detail there that the challenge is making compelling films, not so much capturing spatiotemporal resolution. Although that matters too, and it helps certain scenes a lot.
>sensor size doesn't matter that much if we can't record the full subpixel readout anyway. the big-budget lenses are mostly a marketing scheme to get the wealthy film studios to subsidize the optics companies' personnel's personal lives (which they absolutely deserve to enjoy), pet projects, and optics R&D. the design goals of those lenses are mostly consistency of housing to work with standardized follow-focus tech. they are also designed for ease of collimation, so that they can be collimated for each camera they are used on and there are not blurred edges and such as often. a tertiary design objective is sturdiness of housing and ease of disassembly and repair, since these lenses are rented out and need to be pristine for each successive client. i don't understand why the film studios obsess over this stuff. optically these lenses don't usually outresolve consumer-grade cinema lenses; sometimes they do, but a correct lens selection for the scene and lighting does more than an extra fluorite element in an improper lens choice for the scene.
i looked at lens resolution tests, theory, and data for Super 16 lenses years ago. the professional lenses of that era outresolved the current sensors. modern MFT lenses for photography and cinematography also tend to outresolve digital cinema camera resolutions. their focus and aperture controls are usually designed to photography standards rather than cinematography, but some lenses like the Voigtlanders have clickless-aperture models (i don't have any yet). an MFT photo lens can autofocus on Blackmagic and Kinefinity bodies, which is sometimes better for action sports where the camera is body-mounted and the focal length and aperture don't offer a lot of distancing options (whereas daytime with a wide angle would be better on an old S16 lens or something). and MFT sensors have higher pixel densities than larger sensors, so the lenses are often designed for extreme precision, some of the Panasonic-Leica and Olympus lenses especially.
>MJPEG is not raw, but raw is ostensibly not quite raw either. as long as the color science of the app authors is not mathematically aggressive toward the image information, i don't see why it's not still higher-fidelity than most videos we see. DCPs are often MJPEG or JPEG2000 and they generally look good enough, even though we can go beyond them.
anyway, the reason codecs lose image quality is presumably that they rely on creating palettes and then addressing those palettes for a matrix of pixels that have been binned to a palette address. but with 12-bit video that is 36 bits of RGB, which is like 64 billion colors: either the palette compression and decompression become very computationally intensive, or the mathematical rounding to the nearest palette color becomes too aggressive and lossy. furthermore, this thread was initially about getting hi-fi video to devices that are computationally limited rather than space-limited, as the files would be distributed on SSDs, CFast 2.0, SD, and microSD, depending on the platform. decompressing the image fully to the target resolution (or 1/4 or 1/16 of it) and then reading the data straight to the framebuffer would allow low-spec devices to play hi-fi video and still probably run background apps on multi-core devices
*addendum: unsure if bayer resolution would be 2/3 or 4/9 of the marketed resolution, or what exactly. anyway, 720p displays have not been maxed out. a 480p 10-bit display running at 60Hz would have a 65.9 MBps bitrate, over twice that of contemporary cinema distribution. older 480p displays with 8-bit color running at 60Hz would have a bitrate of 53 MBps.
Vimeo's recommended '8k' compression is 1 MBps, although I don't know if they have a maximum bitrate, nor whether they auto-transcode to some standard 8k target past a point.
a 720p 8-bit display running at 60Hz, a very popular display type, maxes out at 158.2 MBps. So I feel Blackmagic is an honest company, marketing the BMPCC as a 1080p camera, because it films around that or slightly above. Calling it a 720p camera would have been a marketing and PR headache while others toss 4k and 8k into advertising campaigns with either no intellectual honesty or no clue what they are talking about. may as well call the BMPCC 4k a 96k camera if GoPro will call their 90 Mbps camera 4k when the BMPCC 4k can film at 2,176 Mbps. the Arri Alexa 65 would be called a 200+k camera. feels like we are along for the ride a bit with mainstream marketing and such; it's hard to communicate to employers/clients what we can do if they don't study the tech as much. nevertheless we persevere
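the display-bitrate figures quoted in these posts (65.9, 53, and 158.2 MBps) all come from one formula, width x height x bits-per-pixel x refresh / 8, read as MiB/s. a quick check:

```cpp
// Bandwidth needed to feed a display uncompressed, in MiB/s
// (the thread's "MBps" figures read as binary mebibytes per second).
#include <cassert>
#include <cstdint>

double display_mibps(uint64_t w, uint64_t h,
                     uint64_t bits_per_pixel, uint64_t hz) {
    double bytes = double(w * h * bits_per_pixel * hz) / 8.0;
    return bytes / (1024.0 * 1024.0);
}
```

by the same formula, 320x240 at 60Hz with 24-bit color is ~13.2 MiB/s, the figure used later in the Vimeo comparison.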
>>4860 >LUT isn't going to losslessly grade 36 bit RGB though
First of all, there's not a single camera producing 36-bit images. This is just nonsense. 24-bit is more than enough to capture the dynamic range.
Second, 3D LUTs are only used in cinema to preview different looks. They are not used for final color grading.
>i don't agree that pro-res maintains that fidelity.
ProRes 4444 will preserve all visual information. This means no artifacts will appear in the footage unless you push the information, such as contrast/saturation. ProRes is visually lossless.
>bayer sensor is 2/3 the marketed resolution.
Do you even know the history of digital cameras? Please, go read a fucking article/book.
>full bit-perfect readouts
What does this even mean? Do you mean full color information, like a Foveon sensor? If so, my answer is: Sigma holds the patents for Foveon, no other company can produce it. Recently the University of Zurich announced some research in this area, but we are probably very far from having something like Foveon.
>sensor size doesn't matter that much if we can't record the full subpixel readout anyway.
You're chaining a lot of buzzwords here without even knowing what they mean.
>the big-budget lenses are mostly a marketing scheme
Some features are, some are not. The large-format movement is legit. There's a difference in the depth of field.
>don't see why it's not still higher-fidelity than most videos we see.
This directly contradicts your first proposal, in the OP. Are you schizo?
>color science of the app authors is not mathematically aggressive
The "app" will compress the color space to Rec.709 or Rec.2020. A true raw image has no color space, so you can color grade in a much bigger standard, such as ACES.
>straight to the framebuffer would allow low-spec devices to play
It will not, because you've only shifted the work from decoding to debayering, and high-quality debayering is very complex and intensive.
>Vimeo's recommended '8k'compression is 1MBps
Citation needed. I don't even think Vimeo supports 8k, only 4k. Also, I'm quite sure the recommended compression is not 1 MBps but 40 MBps.
I won't reply anymore, because I think you're a chatbot trolling me. If you're not, learn to write correctly, and read about the concepts you're talking about before presenting yourself as an expert in the area.
>12-bit color is per channel; R+G+B is 36 bits total. Although i realize that if a raw file is storing subpixel bayer values, it is not quite clear what the bit depth is. A 12-bit Foveon image would have 36 bits describing each pixel. i can't say what is necessary to preserve the dynamic range, although 10 bits per channel / 30 bits for readout to display seems fine, more for the math of color grading to not cause shearing.
>ProRes 4444 isn't lossless. Neither is Apple Lossless audio nor FLAC. ECC memory is a hardware virus. We can't assume every engineer is at the zenith of wisdom in their field.
>one of the Panavision engineers has a long article on bayer sensors and marketed resolutions. There are also those Sony presentation slides where they rotate the sensor 45° and suddenly claim much higher resolution. Debayering techniques vary wildly. Anyway, they typically count the subpixels horizontally rather than in clusters of 3. Could go into more detail, but it is camera-specific and i don't feel like searching through a lot of spec sheets and PDFs at the moment.
>a bit-perfect readout would mean: if the green subpixel at (69,420) had a 12-bit value of 1024, it would be stored at (69,420) in the relevant matrix, and when pointed to would read back 1024, not some ([11-bit palette color address]) or similar technique that often gets marketed as lossless, visually lossless, or raw.
>Foveon is great yeah, we plan to do timelapse with it. Been hoping for video on those cameras, maybe it will happen; Sigma doesn't have the R&D resources that others have and so rightfully focused on refining the photography capabilities and desktop apps.
>not buzzwords, been studying this stuff for a long time
>the large-format lenses are quality lenses, absolutely, but the resolving-fidelity-to-price ratio is wacky and makes almost no sense except for film studios whose equipment and personnel match what i wrote above. The modern MFT-mount cinema lenses are a lot more affordable and can generally outresolve these cinema sensors. Depth of field is thinner on larger sensors because the focal plane is really a focal sphere and intersects the sensor plane tangentially when the center of the image is in focus. Wide-aperture lenses can achieve this effect on smaller sensors. Lots of f/0.8-f/1.2 MFT lenses. Adapting Nikkor F and Canon EF lenses via Metabones Speed Boosters yields a lot more options in that f-stop range.
>38 MBps MJPEG probably looks better sometimes than 30 MBps MJPEG, unless the app developer's color science is mathematically more aggressive than someone's raw grade/debayer/transcode. In an industry that spends hundreds of thousands of dollars renting equipment that doesn't outperform equipment costing $1,000-10,000, it's plausible that the film graders also don't quite know their field fully. i saw Samsara and Visitors in theatres and Baraka and Visitors on Blu-ray. Ron Fricke and Jon Kane and other such artists know their craft and eke higher fidelity out of these file-type limitations. Those Sony 4k demo reels for TVs are only 14-15 Mbps and look quite good. am not saying these apps necessarily produce nicer footage, just that if they tweak the color science and/or add more options, they should theoretically match or exceed the spatiotemporal resolutions seen in cinemas at the moment.
>interesting info about the Rec color spaces. i come from a video game programming background, ASCII with hex values, and some printing with CMYK conversions, literally never thought about Rec color spaces. will study that, thanks. i think our custom codecs and video players could bypass that stuff and just output RGB like game consoles.
>am talking about an intermediary codec for low-spec devices. CinemaDNG playback would just be on high-spec workstations. when i design those on paper they are usually $8,000, sometimes more, although they can be built for less. That is fine for enthusiast cinemas and some homes and culture centers, but another decodeless codec or codec set would allow playback on personal computing devices and on cinema servers that currently play back 30 MBps DCP MJPEG / JPEG2000 etc., which gives us hints as to their specs (which others could surely explain better if they have access to mainstream cinema servers).
sorry, i wasn't thinking straight, juggling a lot. the recommended 8k bitrate for vimeo is 5-8 MBps. That's not even 320x240 60fps 8-bit-per-channel/24-bit RGB (13 MBps).
>ad hominem
._. am addressing your points pretty thoroughly. you brought up some important stuff i hadn't considered, yet most of what i wrote, aside from the vimeo snafu, is accurate. it just takes time to think through; i try to visualize the low-level hardware happenings rather than what marketers and programmers say about their stuff, although of course i pay a lot of attention to that too and learn from them a lot.
expanding on the 36-bit confusion: if these 'raw formats' are missing like 2/3 or more of their marketed resolutions, can we even say they are storing the subpixel values losslessly? do they just discard every Xth subpixel or something, or try some sort of lower-bit-depth palette binning? if these 'open' codecs were as open as the word suggests, the online documentation would not be obtuse, and decoding those files would be as simple as reading the subpixel color value at a given matrix coordinate. if it is not so simple, it is because they are actually encoded files.
also a note about the MJPEGs: i have graded raw files from an Olympus camera that i was doing ultraviolet photography with. it was very difficult to match the exquisite balance of the JPEGs the camera produced parallel to the raw files. Granted, it's Olympus; personally i consider them #1 in this. But that sort of color science integrated into these iPod apps might yield better results than the MJPEGs and JPEG2000s we normally see in cinemas.
oops, still messed up the vimeo thing: it is 6-10 MBps. Better, but when we are talking about cameras with 169-600+ MBps readouts, i tend to lose focus when evaluating the tech specs of streaming sites that use these wonky codecs, which are computationally laborious to decode. Vimeo 8k is still lower than 320x240 60fps 8-bit-per-channel/24-bit RGB (13 MBps). That's 1/432 the marketed resolution. Am trying to help a lot of beings, including Nanons, and feel there are artistic, communicative, documentary, and financial opportunities here in leapfrogging mainstream video tech. could also help with animations in games and such.
>>4872 I said I wouldn't reply again but:
>not buzzwords, been studying this stuff for a long time
No, either you're not studying, or you're just an idiot who cannot understand what you read.
I've been working in photography/cinema for 8 years now. I can identify when someone is full of shit.
>most of what i wrote aside from the vimeo snafu is accurate.
Nope. It's not. Read my comments again.
>ad hominem
When someone doesn't listen to arguments, ad hominem is the only option left.
sage plants like salvia divinorum teach us that there's a whole lot going on in the microcosmos
it took me years to realize this stuff about video. i pre-ordered a BMPCC and got a used Olympus E-P2 and a Panasonic GX7; meanwhile i was paying attention to what my girlfriend was doing, who has a masters in photography and was a professional photographer for quite a while. she is a programmer nowadays too. we're in agreement on this stuff, having discussed it for the past six years.
in my opinion it's best to refrain from too much speculation until playing back a file in Fast CinemaDNG on a 10-bit monitor (an Eizo or ViewSonic or comparable), or an Optoma projector or a high-end Sony or Christie or something. personally i feel the results would be awe-inspiring and surreal.
anyway, i will be around if any programmers would like to collaborate on implementing this stuff. i will be studying programming, so eventually i could perhaps code it myself, but it would happen faster with experienced programmers collaborating. payment would probably be royalties initially, plus whatever seems agreeable; i don't have money at the moment but might start earning decent income this year.
-~*
8k 60fps vimeo is 1/569 equivalent monitor bitrate
4k 60fps vimeo is 1/190 equivalent monitor bitrate
4k 60fps youtube is 1/513 equivalent monitor bitrate
1080p 60fps youtube is 1/749 equivalent monitor bitrate
1 extra bit of resolution doubles the color precision, for example 256 to 512 to 1024 steps per color channel. so sometimes 1 bit means 512 extra gradations.
perceptible resolution is perhaps not calculable; the linear bitrate math just shows there is an enormous amount of information that screens and projectors can receive. a GBA could display 4k-youtube-bitrate video in bitmap mode if the ROM read speeds and relevant bus speeds were quick enough. will research that. the extent to which we will code for older consoles and computers is unknown at this point, but we are looking back to 90s computers at least, with programs for 1999-2010 devices very likely