OpenAI is All Abuzz Over DeepSeek’s Incredible Progress
“DeepSeek Has Gotten OpenAI Fired Up”
“In the not-so-distant future, when you wave goodbye to a loved one at the airport, a series of cameras could automatically track your sentiment, storing the data to help companies sell you things, to alert the police of any ‘abnormal behavior,’ or even just to understand, on a mass scale, the emotional state of humanity in real time.”
Ridiculous as it may sound, surveillance is no longer just about identifying ‘who.’ It now drills down into the nuances of ‘how’ and ‘why.’ Enter DeepSeek, a machine-learning system taking giant leaps in understanding objects and their movements, potentially revolutionizing how we comprehend video footage.
Or maybe not. Because despite sparking awe and curiosity, it’s another testament to the fact that AI has become the celebrity guest at the ever-expanding ball of privacy intrusion. We’re still struggling to secure our emails and Facebook profiles, and here we’re served yet another ‘alternatively innovative’ technology from the platter, one morsel at a time.
Let’s unwrap this seemingly innocuous creation, shall we? OpenAI, co-founded by rich-nerd-turned-rich-mogul Elon Musk, has positioned DeepSeek-style emotion tracking and analysis under a “privacy-first” flag waving high. It’s engineered to read emotions, perhaps a tad too literally. In theory, the system could figure out what kind of day you’re having before your coffee does.
What’s the catch, you ask? Well, one person’s idea of ‘emotional recognition’ might be another person’s nightmare of ‘sentiment surveillance.’ In the complex world of technology, it seems every innovation is a double-edged sword. Undoubtedly, DeepSeek’s potential applications are an exciting prospect. But then, anything can be thrilling when it isn’t wrapped in a fog of data-privacy concerns.
Consider DeepSeek a multi-talented prodigy in a school of one-dimensional algorithms. It has an eye for detail that would make a Sherlock Holmes enthusiast applaud. Yet it doesn’t stop at recognising a melancholic gaze: the algorithm can place an actor in a scene or recreate an entire video game environment. How’s that for extracurricular activities?
But here’s the twist. For all these novelties, DeepSeek appears eerily similar to Project Maven, a military technology developed to process drone surveillance videos. It might be a coincidence, but it could also be a subtle reminder of technology’s uncanny capacity to intertwine leisure and warfare.
Let’s face it: as intimidating as it sounds, DeepSeek might be the future we never saw coming. It aims to refine the ‘screen of surveillance,’ but it raises the question: how much control is too much control? For now, we can only hope the folks at OpenAI remember the saying, “Just because you can, doesn’t mean you should.”
Read the original article here: https://www.wired.com/story/openai-deepseek-stargate-sam-altman/