Galaxy Digital Holdings Ltd.

11/04/2024 | Press release

Backing OpenOrigins to Restore Trust in Digital Content

The Double-Edged Sword of Generative AI

There is no doubt that generative AI is revolutionary. However, we're entering a world where seeing is no longer believing. Perhaps the most significant question is: how can we ensure trust and verify authenticity in this rapidly evolving digital landscape?

According to Sumsub, last year saw a more than 1,000% increase in the number of deepfakes detected globally across all industries, while the cost of creating deepfakes dropped to nearly zero. Unfortunately, the 2024 U.S. election further highlighted just how serious the deepfake epidemic has become - from Taylor Swift's public denunciation of a deepfake post claiming her endorsement, to the FBI's warning about fake videos alleging ballot fraud, to synthesized and manipulated voices of both presidential candidates spreading political disinformation.

Beyond the immediate risks, there is also a critical need to secure and authenticate real-world data used for training and fine-tuning AI models. Researchers from the Universities of Oxford and Cambridge have discovered that even the inclusion of 10% synthetic (or fake) training data can degrade AI performance significantly, leading to issues like "model collapse."

Source: Adobe

Existing Solutions: Why They Fall Short

While there are notable attempts to combat deepfakes and offer authentication solutions for digital content, the approaches we've seen have significant limitations:

Approach 1: Detection (using AI to analyze pixels). This is a reactive mechanism that relies on constantly updating models - a classic "whack-a-mole" tactic. Detection models must continuously evolve and retrain to keep pace with advances in deepfake generation, and generated output will eventually become undetectable altogether. Even worse, detectors ironically help deepfake generators get better at disguising their anomalies. Take a close look at the gallery below - can you tell which image is authentic? Pixel-level differences are quickly becoming undetectable.

Source: Europol Innovation Lab

Approach 2: Crowd-sourcing. Some solutions rely on a crowd-sourcing strategy to verify content against decentralized databases or protocols. However, crowdsourced reporting is often inconsistent and may lack the coverage needed to counter fast-evolving deepfake techniques. As such, these approaches fall short of the robustness and scalability required for enterprise-grade applications, which demand timely, consistent, high-confidence authentication of media content across vast data volumes.

Approach 3: Watermarking. One prevalent watermarking option is Content Credentials (CR), defined by the C2PA standard. While we closely monitor its development, we recognize its current limitations: it does not capture lower-level metadata, and it is strongly tied to the Adobe ecosystem, with Photoshop as the sole compliant tool for editing images - addressing only a fraction of the millions of images generated each year. Additionally, watermarks can be easily removed or copied by fraudsters, diluting the effectiveness of a genuine watermark.

"Declaration", not "Detection"

We are excited to support OpenOrigins in its full-stack, immutable approach to establishing trust in digital content. OpenOrigins' innovative IP, developed by Dr. Manny (Mansoor) Ahmed, founder and CEO of OpenOrigins, represents the culmination of over eight years of research. Dr. Ahmed, who earned a PhD in computer security and completed a postdoc in distributed computing at the University of Cambridge, began studying deepfakes in 2017. He soon realized that detection alone would only lead to an ever-escalating arms race and that a completely different approach was needed.

Instead of reactive detection, OpenOrigins offers a fundamentally different, proactive solution by declaring authenticity at the point of capture. It addresses the challenge with a multi-source, tamper-proof infrastructure that secures content in real time at the point of generation, establishing a root of trust for content archives. This approach combines trusted execution environments (TEEs), device-based sensors, cryptography, and blockchain to build trust in online content from the ground up.

One of OpenOrigins' key innovations operates at the trusted-compute layer: the hardware security chip, such as a trusted platform module (TPM), embedded in smartphones, PCs, and other devices. OpenOrigins leverages existing movement sensors (e.g. gyroscope, accelerometer, magnetometer, pressure sensor) and/or depth sensors (LiDAR, radar, sonar, etc.) to capture metadata generated at the device, all securely validated and encrypted by the TPM before being written into its proprietary distributed system.

Source: OpenOrigins
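
To make the idea of a capture-time declaration concrete, below is a minimal, hypothetical sketch of how a media file and its sensor metadata might be hashed and signed by a device key before being registered. This is not OpenOrigins' implementation: the field names, the software-generated Ed25519 key, and the use of the Python cryptography package are illustrative assumptions; in a real device the signing key would be sealed inside the TPM or TEE and never exposed to software.

```python
# Illustrative sketch only -- not OpenOrigins' implementation.
# Assumes the `cryptography` package; on a real device the signing key
# would be generated and held inside the TPM/TEE, never in software.
import hashlib
import json
import time

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in for a hardware-backed key sealed in the TPM/TEE.
device_key = Ed25519PrivateKey.generate()

def declare_capture(media_bytes: bytes, sensor_metadata: dict) -> dict:
    """Hash the media and its sensor metadata, then sign the digest with the device key."""
    record = {
        "media_sha256": hashlib.sha256(media_bytes).hexdigest(),
        "sensor_metadata": sensor_metadata,  # e.g. gyroscope, accelerometer, depth readings
        "captured_at": time.time(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = device_key.sign(payload).hex()
    return record

# Example: a capture with hypothetical sensor readings taken at the moment of capture.
declaration = declare_capture(
    b"raw image bytes",
    {"gyro": [0.01, -0.02, 0.00], "lidar_depth_m": 1.8},
)
print(declaration["media_sha256"], declaration["signature"][:16], "...")
```

Because the signature covers both the pixels and the physical sensor context, any later edit to either invalidates the declaration, which is what shifts the problem from detecting fakes to proving originals.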

Building on this robust foundation of device-level security, OpenOrigins offers Archive Anchoring as a Service for media and content archives. Assets and their metadata are registered on a scalable, decentralized network, which immutably anchors them and protects them from tampering and manipulation, so that their authenticity and provenance can be verified.
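
As a rough illustration of what anchoring could look like under the hood, the sketch below records each asset's digest and its metadata digest in a stand-in, append-only ledger, chaining entries so earlier records cannot be rewritten silently. The anchor_asset helper and in-memory ledger are hypothetical simplifications; an actual deployment would anchor these digests to a decentralized network rather than a local list.

```python
# Illustrative sketch only: anchoring asset digests to a stand-in, append-only ledger.
# A real deployment would write these digests to a decentralized network.
import hashlib
import json

ledger: list[dict] = []  # stand-in for an immutable, decentralized ledger

def anchor_asset(asset_bytes: bytes, metadata: dict) -> dict:
    """Record an asset's content digest and metadata digest as one anchor entry."""
    prev = ledger[-1] if ledger else None
    entry = {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "metadata_sha256": hashlib.sha256(
            json.dumps(metadata, sort_keys=True).encode()
        ).hexdigest(),
        # Chain to the previous entry so history cannot be rewritten without detection.
        "prev_entry_sha256": hashlib.sha256(
            json.dumps(prev, sort_keys=True).encode()
        ).hexdigest() if prev else None,
    }
    ledger.append(entry)
    return entry

anchor = anchor_asset(b"archived video bytes", {"title": "Example clip", "year": 1984})
print(anchor["asset_sha256"])
```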

To give customers full control, each customer hosts a node that they can independently maintain, keeping the archive securely in their hands. This configuration also allows external parties to independently verify content integrity.
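
Continuing the same hypothetical sketch, an external verifier holding only the asset bytes, the claimed metadata, and the published anchor entry can re-derive the digests and compare them. The verify_asset helper below is illustrative, not OpenOrigins' API.

```python
# Illustrative sketch only: independent verification of an asset against a
# published anchor entry (constructed inline here for the example).
import hashlib
import json

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_asset(asset_bytes: bytes, metadata: dict, anchor_entry: dict) -> bool:
    """Re-derive digests from the content and compare them to the anchored values."""
    return (
        digest(asset_bytes) == anchor_entry["asset_sha256"]
        and digest(json.dumps(metadata, sort_keys=True).encode())
        == anchor_entry["metadata_sha256"]
    )

asset, metadata = b"archived video bytes", {"title": "Example clip", "year": 1984}
anchor_entry = {  # what a verifier might fetch from a customer-hosted node
    "asset_sha256": digest(asset),
    "metadata_sha256": digest(json.dumps(metadata, sort_keys=True).encode()),
}
print(verify_asset(asset, metadata, anchor_entry))              # True
print(verify_asset(b"tampered bytes", metadata, anchor_entry))  # False: any change breaks the match
```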

Once anchored and authenticated, media can be credibly licensed as provably human, copyright-compliant source data for AI training. With growing demand for high-quality, authentic, non-synthetic training data, OpenOrigins is well positioned at the center of this flywheel, serving as a data marketplace that connects the supply and demand sides of the AI training market.

Trusted by Enterprises and Customers

OpenOrigins has seen significant interest from global media companies. The company has partnered with ITN, one of the UK's leading content producers behind Channel 4 and ITV, to protect the human origins of its media archive, which spans the past 70 years and consists of over a million video and audio assets. ITN can now unlock a new revenue stream by licensing its secured archives to AI companies that are responsibly training their models via OpenOrigins' marketplace.

OpenOrigins' ability to win the trust of media partners was also demonstrated by its recent partnership with Fathm (see the public announcement), a leading media innovation organization that works with global newsrooms and media platforms such as Graham Media Group, the Google News Initiative, and AFP. Through the partnership with Fathm, OpenOrigins will help newsrooms of all sizes take advantage of new AI-driven licensing opportunities, ensuring that content remains secure, ethical, and profitable.

Last Line of Defense Against Deepfakes with Scalable AI and Blockchain

Our investment in OpenOrigins aligns with our vision of harnessing the transformative power of AI, device-based and distributed security, cryptography, and blockchain technology. Their innovative and scalable technology has potential applications far beyond the media sector, including neural network authentication, customer onboarding/KYC, fraud mitigation, remote infrastructure monitoring, and visual inspection. While their innovations present exciting opportunities, it is essential to acknowledge that challenges remain in the fight against deepfakes and digital misinformation.

It's often said that the best offense is a good defense. We believe OpenOrigins could be the last line of defense against the rapidly evolving threat of deepfake technology.

Feel free to reach out to [email protected] and [email protected] if you have any questions or insights about deepfakes, GenAI, decentralized infrastructure, or distributed computing. We welcome ideas, suggestions, and opportunities to work together to build a more trustworthy, connected, and intelligent internet.

*OpenOrigins is a Galaxy Interactive and Galaxy Ventures portfolio company.

Legal Disclosure:
This document, and the information contained herein, has been provided to you by Galaxy Digital Holdings LP and its affiliates ("Galaxy Digital") solely for informational purposes. This document may not be reproduced or redistributed in whole or in part, in any format, without the express written approval of Galaxy Digital. Neither the information, nor any opinion contained in this document, constitutes an offer to buy or sell, or a solicitation of an offer to buy or sell, any advisory services, securities, futures, options or other financial instruments or to participate in any advisory services or trading strategy. Nothing contained in this document constitutes investment, legal or tax advice or is an endorsement of any of the digital assets or companies mentioned herein. You should make your own investigations and evaluations of the information herein. Any decisions based on information contained in this document are the sole responsibility of the reader. Certain statements in this document reflect Galaxy Digital's views, estimates, opinions or predictions (which may be based on proprietary models and assumptions, including, in particular, Galaxy Digital's views on the current and future market for certain digital assets), and there is no guarantee that these views, estimates, opinions or predictions are currently accurate or that they will be ultimately realized. To the extent these assumptions or models are not correct or circumstances change, the actual performance may vary substantially from, and be less than, the estimates included herein. None of Galaxy Digital nor any of its affiliates, shareholders, partners, members, directors, officers, management, employees or representatives makes any representation or warranty, express or implied, as to the accuracy or completeness of any of the information or any other information (whether communicated in written or oral form) transmitted or made available to you. Each of the aforementioned parties expressly disclaims any and all liability relating to or resulting from the use of this information. Certain information contained herein (including financial information) has been obtained from published and non-published sources. Such information has not been independently verified by Galaxy Digital and, Galaxy Digital, does not assume responsibility for the accuracy of such information. Affiliates of Galaxy Digital may have owned or may own investments in some of the digital assets and protocols discussed in this document. Except where otherwise indicated, the information in this document is based on matters as they exist as of the date of preparation and not as of any future date, and will not be updated or otherwise revised to reflect information that subsequently becomes available, or circumstances existing or changes occurring after the date hereof. This document provides links to other Websites that we think might be of interest to you. Please note that when you click on one of these links, you may be moving to a provider's website that is not associated with Galaxy Digital. These linked sites and their providers are not controlled by us, and we are not responsible for the contents or the proper operation of any linked site. The inclusion of any link does not imply our endorsement or our adoption of the statements therein. We encourage you to read the terms of use and privacy statements of these linked sites as their policies may differ from ours. 
The foregoing does not constitute a "research report" as defined by FINRA Rule 2241 or a "debt research report" as defined by FINRA Rule 2242 and was not prepared by Galaxy Digital Partners LLC. For all inquiries, please email [email protected]. ©Copyright Galaxy Digital Holdings LP 2024. All rights reserved.