Top 5 AI stories I’m waiting for in 2023 | AI beat
Tomorrow morning I head south, straight down I-95, from central New Jersey to northeast Florida, where I'll set up my laptop in St. Augustine for the next two months. It's about as far from Silicon Valley as you can get in the continental US, but that's where you'll find me gearing up for the first artificial intelligence (AI) news of 2023.
Here are the five biggest AI stories I'm waiting for:
1. GPT-4
ChatGPT is so 2022, don't you think? The hype around OpenAI's chatbot "research preview," released on November 30, has barely peaked, but the buzz around what's coming next, GPT-4, sounds like millions of Swifties waiting for Taylor's next album release.
If expert predictions and OpenAI's cryptic tweets are correct, GPT-4, with more parameters and trained on more data, will launch in early-to-mid 2023 and "blow minds." It will likely still be filled with the same unreliable BS as ChatGPT and GPT-3, but it may be multimodal: able to work with images, text and other data.
It's been less than three years since GPT-3 was released, and only two since the first DALL-E research paper was published. When it comes to the pace of large language model innovation in 2023, many are saying "buckle up."
2. EU AI Act
AI technology may be evolving quickly, but so is AI regulation. While a variety of state-level AI-related bills have passed in the United States, it's broader government regulation, in the form of the EU AI Act, that everyone has been waiting for. On December 6, the EU AI Act took a step toward becoming law when the Council of the European Union adopted amendments to the draft regulation, opening the door for the European Parliament to finalize its position.
The EU AI Act, according to Avi Gesser, partner at Debevoise & Plimpton and co-chair of the firm's Cybersecurity, Privacy and Artificial Intelligence Practice Group, attempts to create a risk-based regime to address the highest-risk uses of artificial intelligence. Like the GDPR, it would be an example of comprehensive European law coming into effect first, with various state and industry-specific laws following gradually in the United States, he recently told VentureBeat.
Boston Consulting Group calls the EU AI Act "one of the first wide-ranging regulatory frameworks on AI" and expects it to be enacted into law in 2023. Since it will apply whenever business is done with any EU citizen, regardless of where the company is located, it will likely affect nearly every business.
3. The search war
Last week, the New York Times called ChatGPT a "code red" for Google's search business. In mid-December, You.com announced it had opened its search platform to generative AI applications. Then, on Christmas Eve, You.com launched YouChat, which it calls "conversational AI with citations and real-time data, right in your search bar."
To me, all of this adds up to a real battle for the future of search in 2023, and I'm munching popcorn while I wait for Google's next move. As I wrote recently, Google processes billions of searches a day, so it isn't going away anytime soon. But perhaps ChatGPT, and even You.com, is just the beginning of new, imaginative thinking about the future of AI and search.
And as Alex Kantrowitz told Axios recently, Google may have to make a move: "It's game time for Google," he said. "I don't think it can sit on the sidelines for too long."
4. Open Source vs Closed AI
I am intrigued by the ongoing debate around open-source versus closed AI. Between the growing momentum of Hugging Face's open-source models (the company reached a $2 billion valuation in May); Stable Diffusion's big summer splash in the text-to-image space; and the first open-source copyright lawsuit, targeting GitHub Copilot, open-source AI had a big, impactful 2022.
That will certainly continue into 2023, but I'm most interested in how it stacks up against the evolution of closed-source AI models. After all, OpenAI went closed source and is currently preparing to release GPT-4, arguably the most anticipated AI model ever; that's surely a competitive advantage, isn't it?
On the other hand, MIT Technology Review observed that "an open-source revolution has begun to match, and sometimes surpass, what the wealthiest labs are doing." Sasha Luccioni, a research scientist at Hugging Face, agreed, arguing that open-source AI is also more ethical. She tweeted last week that open-source AI models "make it easier to find and analyze ethical issues, as opposed to keeping them closed source and saying 'trust me, we're filtering out all the bad stuff.'"
5. Is AI running out of training data and computing power?
Will 2023 be the beginning of an era of AI innovation when it comes to data and computation?
ChatGPT's compute costs, according to OpenAI's Sam Altman, are "eye-watering," while IBM says we're running out of computing power altogether: AI models are "growing exponentially," but the hardware to train and run them isn't keeping pace. Meanwhile, a research paper warned that the data commonly used to train language models could be exhausted in the near future, possibly as early as 2026.
I'm eager to see how this plays out next year. Will bigger no longer be better when it comes to data and compute? Will new AI chips designed for deep learning models be a game-changer? Will synthetic data be an answer to the training data problem? I've got popcorn ready for this one, too.
Wishing you all a happy new year!
I’ll be back at my temporary beachside “office” on January 2. Until then, enjoy the last week of 2022 and have a happy, healthy new year. As a reminder, I am on Twitter at @sharongoldman and can be reached at [email protected]