To rely on brilliant statements from The Guardian and Think China is a true absurdity. Cheerleading for China is even more absurd. You need to learn about tech before you open your big mouth.
Thanks for the comment. The point wasn't cheerleading for China; it was pointing out that those comments have been made in recent months. Regarding The Guardian, we're all entitled to our opinions and observations. Go back and see my work on Alibaba. Cheers.
Two things. First, for a country that asked us to believe C-19 came from bat stew, I find it hard to believe DeepSeek was developed in 2 months using off-the-shelf chips at a cost of $6 million. While the tech bros might be fighting, they certainly are not idiots. They wouldn't have spent enormous sums to build these LLMs if there was a similar solution available at a microscopically smaller cost. Sorry, but the Chinese claim is the "cold fusion" ruse all over again.
Second, ask DeepSeek what happened in Tiananmen Square in 1989. The answer you get should end all debate about DeepSeek being any sort of competition to the established AI apps.
Nice piece, Herb. As I mentioned yesterday, I find it amusing that the market reacted as though every claim made by China or Chinese companies is 100% factual. The truth likely lies somewhere in the middle on all the topics you mentioned.
Herb, insightful as usual. Time will tell whether what you call "the beginning of the end" is already here, but in my unqualified, non-engineering view it is too early to call the end of the OpenAI era. I mention OpenAI (vs. other models) specifically because the model is already integrated into Microsoft products. In my opinion, this alone should drive demand.
Would Microsoft integrate an open-source Chinese program? I am a bit skeptical, but I have been wrong before. Yes, the large model is expensive to train, but frankly most users are likely to defer to a closed set of AI applications for which smaller, more cost-efficient models can be used.
Such a great article. A sliver of hope on valuation.
We've all been wrong, Olga. Way too early to figure out which way all of this will go!