
AI Topic: AlphaZero, ChatGPT, Bard, Stable Diffusion and more!

64K

Apparently MS is using ChatGPT in their Bing Search Engine. They posted the following notice on the Bing Page:

"Let's learn together. Bing is powered by AI, so surprises and mistakes are possible. Make sure to check the facts, and share feedback so we can learn and improve!"

I imagine some of the surprises and mistakes are comical indeed.
 

See the Reddit post on the first page!

What is fun about AI right now is that a large number of people are experimenting and sharing their experiences. Now the "Time Travel Glitch" is several months old, so I kinda expect it to be fixed by now. But back in February, ChatGPT / Bing was extremely confused about what date it was. So all sorts of weird glitches would come out if you asked it time-based questions.
 
RISC-V CPU designed in less than 5 hours using AI.
This comment would probably also fit the topic about AI being dangerous for workers' jobs, but that topic is old (the last comment was more than two years ago) and I see no point in reviving it.


In this article, we report a RISC-V CPU automatically designed by a new AI approach, which generates large-scale Boolean function with almost 100% validation accuracy (e.g., > 99.99999999999% as Intel [31]) from only external input-output examples rather than formal programs written by the human. This approach generates the Boolean function represented by a graph structure called Binary Speculation Diagram (BSD), with a theoretical accuracy lower bound by using the Monte Carlo-based expansion, and the distance of Boolean functions is used to tackle the intractability.

These guys invent a new Binary Decision Diagram (calling it a Binary Speculation Diagram), and then have the audacity to call it "AI".

Look, advancements in BDDs are cool and all, but holy shit. These researchers are overhyping their product. When people talk about AI today, they don't mean 1980s-style AI. Don't get me wrong, I like 1980s-style AI, but I recognize that the modern name for it is "Automated Theorem Proving". You can accomplish awesome feats with automated theorem proving (such as this new CPU), but guess what? A billion other researchers are also exploring BDDs (ZDDs, and a whole slew of other alphabet-soup binary (blah) diagrams) because this technique is widely known.

"Chinese AI Team innovates way to call Binary Decision Diagram competitor an AI during the AI hype cycle". That's my summary of the situation. Ironically, I'm personally very interested in their results because BDDs are incredibly cool and awesome. But its deeply misrepresenting what the typical layman considers AI today (which is mostly being applied to ChatGPT-like LLMs, or at a minimum, deep convolutional neural networks that underpin techniques like LLMs).

------------------

Furthermore, standard BDDs can create and verify Intel 486-like chips just fine. That's just a 32-bit function (64 bits with two inputs), probably without the x87 coprocessor (so no 80-bit floats or 160 bits across two inputs). The modern BDD techniques used to automatically verify, say, AMD EPYC or Intel AVX-512 instructions are handling up to three inputs of 512 bits each, or ~1536 bits... and each additional input bit is exponential in the worst case for the BDD technique. So... yeah... 64 input bits vs. 1536 isn't really that impressive.
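Back-of-the-envelope, with my own numbers (assuming two 32-bit operands for the 486 case and three 512-bit operands for the AVX-512 case):

```python
# Rough arithmetic only; the operand counts are my assumption, not from the paper.
bits_486    = 2 * 32      # two 32-bit inputs, ignoring the x87 coprocessor
bits_avx512 = 3 * 512     # three 512-bit inputs
print(bits_486, bits_avx512)            # 64 vs 1536 input bits
print(2 ** bits_486)                    # 2^64 truth-table rows is already ~1.8e19
print(len(str(2 ** bits_avx512)))       # 2^1536 runs to ~463 decimal digits
```

And since worst-case BDD size grows exponentially in the number of input bits, those two problems aren't even in the same universe.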

-----------

In fact, the underlying problem is: with only finite inputs and their expected outputs (i.e., IO examples) of a CPU, inferring the circuit logic in the form of a large-scale Boolean function that can be generalized to infinite IO examples with high accuracy.

I literally have an example of this sitting in my 1990s-era BDD textbook. This isn't new at all. This team is overselling their achievement. Granted, my textbook only covers the "textbook-level" Reduced Ordered Binary Decision Diagram, with a few notes on ZDDs and the like... but I'm not surprised that a new BDD-style structure could lead to some unique advantages.

Now, BSD (or whatever this "Binary Speculation Diagram" thingy is) might be interesting. Who knows? New BDD variants are discovered all the time; it's an exceptionally interesting and useful field, and necessary to advance the state of the art in CPU design, synthesis, and testing. Furthermore, this is exactly the kind of technology I'd expect hardcore chip designers to be using (it's obviously a great technique). But... it's industry standard. This is what CPU researchers have been studying and experimenting with every day for the past 40+ years, I kid you not.

------------

BTW: ROBDDs (and all the data structures built on BDDs) are awesome. I'd love to divert this topic and talk about ROBDDs, their performance characteristics, #P-complete problems, etc. etc. But... it's not AI. It's automated theorem proving: exhaustive, 100%-accurate search that yields provably correct designs.
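Since I mentioned #P-complete problems: the classic party trick is that once a function is sitting in an ROBDD, counting its satisfying assignments is linear in the number of diagram nodes. Toy sketch below (my own hand-built node table and encoding, purely illustrative):

```python
from functools import lru_cache

N_VARS = 3
# Hand-built ROBDD for the 3-input majority function, variable order x0 < x1 < x2.
# Encoding (illustrative): node id -> (variable index, low child, high child),
# with 0 and 1 as terminal ids.
NODES = {
    2: (2, 0, 1),   # node2: x2 ? 1 : 0
    3: (1, 0, 2),   # node3: x1 ? node2 : 0      (i.e. x1 & x2)
    4: (1, 2, 1),   # node4: x1 ? 1 : node2      (i.e. x1 | x2)
    5: (0, 3, 4),   # root : x0 ? node4 : node3  (majority)
}
ROOT = 5

def level(node):
    # Terminals behave as if they sit one level past the last variable.
    return N_VARS if node in (0, 1) else NODES[node][0]

@lru_cache(maxsize=None)
def satcount(node):
    """Satisfying assignments of the sub-function at `node`, counted over the
    variables from level(node) through x_{N_VARS-1}."""
    if node == 0:
        return 0
    if node == 1:
        return 1
    var, lo, hi = NODES[node]
    # Each variable level skipped along an edge doubles the count below it.
    return ((satcount(lo) << (level(lo) - var - 1))
            + (satcount(hi) << (level(hi) - var - 1)))

print(satcount(ROOT) << level(ROOT))   # majority of 3 bits: 4 of the 8 assignments
```

The hard part, of course, is keeping the diagram small in the first place; the counting walk itself is the easy bit.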
 
I literally have an example of this sitting in my 1990s-era BDD textbook. This isn't new at all. This team is overselling their achievement.
Why are you surprised? Overselling accomplishments is part of Chinese culture (and no, I'm not being racist/culturalist/whatever-ist); that's one reason why there's so much concern around the number of scientific papers being published from China.
 
https://www.reddit.com/r/programming/comments/146wn9s
The /r/programming subreddit was catching ChatGPT bots astroturfing pro-Reddit opinions.

(screenshot attached)
 

Gizmodo.com launches "Gizmodo Bot", an AI bot to automatically write articles. Oh look, a chronological listing of Star Wars films, sounds fun. Let's see what...

Star Wars: The Rise of Skywalker (2019) The third installment of the sequel trilogy, The Rise of Skywalker follows Rey as she faces off against the dark side of the Force in an epic battle.

Star Wars: The Clone Wars (2008) The animated series follows Anakin Skywalker and Obi-Wan Kenobi as they battle the Separatists during the Clone Wars.

Star Wars Rebels (2014) The animated series follows a group of rebels as they battle the Empire and search for a way to restore freedom to the galaxy.

Wow, this is hilariously bad and wrong.

Today, the article has been edited by hand and they've at least put things in the right order. But it's not very good writing at all. I've got a link to archive.org so y'all can see the bot's mistakes in their original glory.
 
Have you managed to "annoy" a chatbot? I think I "upset" Bard when I asked: "So... you're just a fancy database?".
Sorry, Bard!
 
Careful, Skynet never forgets.
 