In 2019, at the Adobe MAX event, Adobe announced the Content Authenticity Initiative (CAI), a technology movement to help combat fake news and misinformation spread through digital assets. Their proposed solution is to embed metadata in the digital assets themselves, making them traceable and verifiable information containers.
In 2021, Adobe and a number of other tech companies, including Microsoft, came together to form the Coalition for Content Provenance and Authenticity (C2PA), which released the first draft of its spec to the public this week.
This is a Python library that injects CAI metadata into JPEG images according to the recently published specs. It's been made available so developers can view the source code and play with it.
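For the curious, here is a toy sketch of the JPEG mechanics the injection builds on (this is not the pyc2pa API and is not spec-complete): CAI/C2PA data is carried in APP11 marker segments as JUMBF boxes, so at the byte level an injection boils down to splicing an extra marker segment into the file. File names and the payload below are placeholders.

```python
import struct

def insert_app11_segment(jpeg_bytes: bytes, payload: bytes) -> bytes:
    """Splice one APP11 (0xFFEB) marker segment into a JPEG, right after SOI.

    Toy illustration only: real CAI/C2PA injection wraps a signed manifest in
    JUMBF boxes (ISO/IEC 19566-5), may span several APP11 segments, and would
    place them after any existing APPn segments rather than straight after SOI.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG: missing SOI marker")
    if len(payload) > 0xFFFF - 2:
        raise ValueError("payload too large for a single marker segment")
    # The 2-byte length is big-endian and counts itself plus the payload,
    # but not the 2-byte marker.
    segment = b"\xff\xeb" + struct.pack(">H", 2 + len(payload)) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]

# Placeholder file names and payload.
with open("photo.jpg", "rb") as f:
    original = f.read()
with open("photo-injected.jpg", "wb") as f:
    f.write(insert_app11_segment(original, b"demo-provenance-payload"))
```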
GitHub: https://github.com/numbersprotocol/pyc2pa
PyPI: https://pypi.org/project/c2pa/
Examples of Injected Photo (multi-injection): https://user-images.githubusercontent.com/292790/131797706-937ac2ef-e57c-4fe6-9842-2941deba6cec.jpg
Metadata Reader / Verification: https://verify.contentauthenticity.org/
C2PA Information (3 Injections): https://user-images.githubusercontent.com/292790/131798257-21159c2a-a958-431b-aaea-1649b27aaaaf.png
Use Cases: Photojournalism, News, Combating Deepfakes, Digital Asset Infringement Protection
Real World Uses:
CAI injection was used in the case study 78 Days (https://www.starlinglab.org/78days/), a 78-day photojournalistic account of the 2020-2021 US Presidential transition. An earlier version of this tool was used to perform the CAI injections and make the photos traceable assets.
Resources:
JUMBF Metadata (the theory behind injecting information into JPEG): https://medium.com/numbers-protocol/cai-series-1-how-to-inject-jumbf-metadata-into-jpg-c76826f10e6d
CAI Whitepaper: https://static1.squarespace.com/static/5e531aad6e028a2ed29c3e49/t/5f27fc6c0aec8e324025453f/1596456047676/CAI_WhitePaper_v1_0.pdf
Alice and Bob want to have a private conversation but they also don't want anyone to know they're talking to each other.
I'm assuming that they can use some public key cryptography protocol that's sufficient to ensure their conversation is indeed private. Alice encrypts her messages using Bob's public key, and her encrypted messages can only be decrypted with Bob's private key.
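As a concrete illustration of that step (the library choice here is mine, not anything specified in the post), PyNaCl's SealedBox does exactly this: encrypt to a public key, decrypt only with the matching private key.

```python
# pip install pynacl
from nacl.public import PrivateKey, SealedBox

# Bob generates a keypair and publishes the public half.
bob_private = PrivateKey.generate()
bob_public = bob_private.public_key

# Alice encrypts to Bob's public key...
ciphertext = SealedBox(bob_public).encrypt(b"Meet at the usual place.")

# ...and only Bob's private key can decrypt it.
assert SealedBox(bob_private).decrypt(ciphertext) == b"Meet at the usual place."
```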
But what about the metadata, i.e. the who/what/when/where information that we now know is collected routinely by the NSA, and which allows an adversary to determine that Alice and Bob are in communication?
As I understand it, there are several more or less practical ways to obscure the metadata -- including the identity of the intended recipient -- of Alice's and Bob's messages. These methods include steganography, Tor/pluggable transports, anonymizing email services, and metadata encryption. But all of these approaches have weaknesses (e.g. trust issues, the existence of a central point of attack, susceptibility to traffic analysis), and as long as Alice's messages are ultimately being delivered to Bob (and vice versa), any adversary who could discover this would know that Alice and Bob were in communication.
But what if Alice sent her encrypted messages not only to Bob, but to everyone ( * )? And everyone received them ( ** )? Public key encryption would ensure that only Bob would be able to actually decrypt and read the message, and meanwhile even an adversary with complete access to the entire network between Alice's and Bob's machines would still be unable to determine which particular instance of 'everyone' was the intended recipient. In other words, from the outside, an adversary would not be able to determine who Alice was talking to.
( * ) 'Everyone' here means 'everyone who's participating in this protocol'. Obviously, as with TOR, the more participants the better. This protocol would be trivially useless with only two users. But even three users would provide some protection. (Is Alice talking to Bob or to Carol?) And it would work a whole lot better if Bob were literally one in a million.
( ** ) Or rather: everyone's machine/device automatically received them. Each machine/device would then attempt to decrypt all incoming messages, and non-decipherable messages would automatically be discarded (***). The user would only be notified if the message was in fact for them. (A minimal sketch of this trial-decryption step follows the footnotes.)
(***) Or, more efficiently, be forwarded to a swarm of peers in a process that would be ana
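To make the broadcast-and-trial-decrypt idea above concrete, here is a minimal sketch (the broadcast transport, the peer count, and the library choice are all placeholder assumptions): everyone receives the same ciphertext, every node attempts decryption, and only the real recipient succeeds.

```python
from nacl.public import PrivateKey, SealedBox
from nacl.exceptions import CryptoError

# Three participants for illustration; the scheme only gets better with more.
participants = {name: PrivateKey.generate() for name in ("alice", "bob", "carol")}

def try_receive(own_private_key: PrivateKey, ciphertext: bytes):
    """Each node tries to decrypt; undecipherable messages are discarded."""
    try:
        return SealedBox(own_private_key).decrypt(ciphertext)
    except CryptoError:
        return None  # not for us: drop (or, per (***), forward on to peers)

# Alice encrypts to Bob, but the same blob is delivered to everyone.
ciphertext = SealedBox(participants["bob"].public_key).encrypt(b"hi Bob")

for name, key in participants.items():
    plaintext = try_receive(key, ciphertext)
    print(name, "->", "decrypted" if plaintext else "discarded")
# Only bob decrypts; an observer sees the identical ciphertext go to all three.
```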
Very exciting stuff for any DIY loopers. Here is the press release.
I wonder if this means that the OmniPod will finally be usable with OpenAPS / AndroidAPS / Loop. OmniPod is developed by Insulet, not Ypsomed, but the latter has distribution rights and is a major player, so hopefully they can put the screws on. Also, Ypsomed has their own pump.
I have been reading about waste composting and biogas plants lately
It occurred to me that there is some unused space in my apartment complex, and it seems sufficient for such initiatives.
I have been wondering why such initiatives, especially composting, are not implemented in residential colonies at large. Every apartment generates kitchen waste daily, and it is a big hassle to dispose of it. It is also very environmentally unfriendly, when it is so easy to dispose of via composting.
I would like to know what the experience of implementing them has been. It seems to me that it is not easy to get cooperation from residents at large.
Biogas and compost can also have financial benefits and may attract support from the authorities; I would like to know what people think about that.
Hey guys,
I was looking through this sub, but it seems like this never came up (or my queries were wrong). Basically, I am looking for something like minio where I can just dump all the stuff I want to archive and attach metadata to it, so I can query it later, e.g. "show all PDFs from 2018" or "give me all pictures I took in London in 2017".
I tried numerous directory layouts, but a layout that makes sense to me does not necessarily make sense to my family. So a global search would be the solution.
I do not need the program to do any of the metadata extraction; it is fine for me to apply it manually when uploading (like minio does). However, the minio devs discouraged me from using minio for this, since minio does not seem to scale very well when it comes to metadata. It is pretty much a dumb data dump, as intended.
Any advice on where I might find something like this? I could write it myself, but I cannot imagine that nobody has made something like this before. A poor man's solution would be WebDAV + a sqlite DB + a web frontend.
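In case it helps, here is a minimal sketch of the sqlite half of that poor man's setup (the schema, column names, and example rows are all just illustrative assumptions), answering the two example queries from above:

```python
import sqlite3

con = sqlite3.connect("archive.db")
# One row per archived object; metadata is applied by hand at upload time.
con.execute("""
    CREATE TABLE IF NOT EXISTS objects (
        path     TEXT PRIMARY KEY,   -- where the blob actually lives (WebDAV path etc.)
        kind     TEXT,               -- 'pdf', 'picture', ...
        year     INTEGER,
        location TEXT
    )
""")
con.executemany(
    "INSERT OR REPLACE INTO objects VALUES (?, ?, ?, ?)",
    [
        ("docs/tax-2018.pdf", "pdf", 2018, None),
        ("photos/london-001.jpg", "picture", 2017, "London"),
    ],
)
con.commit()

# "show all PDFs from 2018"
print(con.execute(
    "SELECT path FROM objects WHERE kind = 'pdf' AND year = 2018").fetchall())

# "give me all pictures I took in London in 2017"
print(con.execute(
    "SELECT path FROM objects WHERE kind = 'picture' AND year = 2017"
    " AND location = 'London'").fetchall())
```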
PUBLISH, Inc. is a blockchain-based software solutions provider for newspaper businesses. Its mission is to secure the editorial and financial independence of newspaper businesses using blockchain technology.