
Artificial Intelligence Questions


Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
7,966 (1.12/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 2
Software Win 10 Pro x64
I am with Musk and Hawking. That is a VERY slippery slope to tread. I am not looking forward to Skynet...
 
Joined
Mar 26, 2006
Messages
517 (0.08/day)
Location
Stamford, UK
System Name The Money Sink
Processor Intel i7-5960X at 4.60Ghz
Motherboard MSI X99A Godlike
Cooling Custom watercooling loop, single D5 -> CPU, dual D5 -> GPU's
Memory 64GB DDR4-3000
Video Card(s) 2 x 1080Ti @ Stock for the moment (40oC LOAD)
Storage 960GB Mushkin Scorpion Deluxe and 2 x 512GB M.2 SSD RAID0
Display(s) Dual Curved LG 34" Display
Power Supply EVGA 1600W G2
Software Windows 10
Benchmark Scores ALOT
Humanity would be public enemy number one if a greater suitor arose to take our throne; they'd see us as wasteful, destroying potentially fruitful resources at a very low level...
 

Easy Rhino

Linux Advocate
Staff member
Joined
Nov 13, 2006
Messages
15,444 (2.43/day)
Location
Mid-Atlantic
System Name Desktop
Processor i5 13600KF
Motherboard AsRock B760M Steel Legend Wifi
Cooling Noctua NH-U9S
Memory 4x 16 Gb Gskill S5 DDR5 @6000
Video Card(s) Gigabyte Gaming OC 6750 XT 12GB
Storage WD_BLACK 4TB SN850x
Display(s) Gigabye M32U
Case Corsair Carbide 400C
Audio Device(s) On Board
Power Supply EVGA Supernova 650 P2
Mouse MX Master 3s
Keyboard Logitech G915 Wireless Clicky
Software The Matrix
A.I. is a very underrated movie. I believe Spielberg had to finish it because Kubrick died before filming began?
 
Joined
Jan 25, 2014
Messages
2,069 (0.55/day)
System Name Ryzen 2023
Processor AMD Ryzen 7 7700
Motherboard Asrock B650E Steel Legend Wifi
Cooling Noctua NH-D15
Memory G Skill Flare X5 2x16gb cl32@6000 MHz
Video Card(s) Sapphire Radeon RX 6950 XT Nitro + gaming Oc
Storage WESTERN DIGITAL 1TB 64MB 7k SATA600 Blue WD10EZEX, WD Black SN850X 1Tb nvme
Display(s) LG 27GP850P-B
Case Corsair 5000D airflow tempered glass
Power Supply Seasonic Prime GX-850W
Mouse A4Tech V7M bloody
Keyboard Genius KB-G255
Software Windows 10 64bit
It's not easy to make an A.I. We had a project in school to write a game called Nim in MATLAB and make it slowly get smarter and smarter until you eventually couldn't beat it anymore. Needless to say, we didn't finish the game; we just found something (I believe it was a Tetris clone), changed a couple of variable names, and handed that in as the project. We did it the lazy way and got a pretty good grade out of it.
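For what it's worth, the kind of self-improving Nim player that assignment describes can be sketched with simple matchbox-style reinforcement. This is a toy illustration in Python rather than MATLAB, and the pile size, reward values, and exploration rate are my own assumptions, not the coursework's:

```python
import random

# Single-pile Nim: players alternate taking 1-3 sticks; whoever takes
# the last stick wins. The agent keeps a weight for each (sticks, move)
# pair and reinforces moves that appeared in winning games.
MAX_TAKE = 3
START = 21

weights = {(s, m): 1.0 for s in range(1, START + 1)
           for m in range(1, min(MAX_TAKE, s) + 1)}

def pick(sticks, explore=0.1):
    """Choose a move: mostly greedy on learned weights, sometimes random."""
    moves = list(range(1, min(MAX_TAKE, sticks) + 1))
    if random.random() < explore:
        return random.choice(moves)
    return max(moves, key=lambda m: weights[(sticks, m)])

def train(games=20000):
    for _ in range(games):
        sticks, history, player = START, {0: [], 1: []}, 0
        while sticks > 0:
            move = pick(sticks)
            history[player].append((sticks, move))
            sticks -= move
            if sticks == 0:
                winner = player
            player ^= 1
        # Reward every move the winner made; mildly punish the loser's.
        for p in (0, 1):
            delta = 1.0 if p == winner else -0.25
            for key in history[p]:
                weights[key] = max(0.01, weights[key] + delta)

train()
# With enough games this tends toward the known strategy of leaving the
# opponent a multiple of 4, though convergence isn't guaranteed.
print(pick(21, explore=0.0))
```

The point is that nothing Nim-specific is coded in; the "getting smarter" is just weight updates from self-play, which matches the spirit of the assignment.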
 
Joined
Dec 29, 2014
Messages
861 (0.25/day)
This article had me considering how I would treat a robot that looked and acted human. Would it be property or a person?

We are going to face a really big revolutionary challenge before AI gets to that point.

And I doubt that it will ever be possible to make a computer that is self-aware and conscious. Rather, they will be very intelligent and sophisticated slaves. You'll be able to program them to behave like humans (if you wanted to for some reason), but they will always be missing something.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
http://arstechnica.com/science/2015...-acts-like-a-brain-will-we-treat-it-like-one/

This article had me considering how I would treat a robot that looked and acted human. Would it be property or a person? This reminded me of a movie that poses the question of ethical considerations.
You should play Binary Domain; the ramifications go deeper than just impersonation. There's also The Talos Principle, which explores many ideas about self-aware AI.

Personally, I think we need AIs but simply not AIs that are self-aware. Self-aware AIs should be unequivocally banned. If an AI does become self-aware, I don't know what the appropriate...solution would be. Bones made of metal, blood made of hydraulic fluid, and neurons made of transistors. Besides the physical differences (biological versus mechanical) who are we to judge what is and isn't life? When merely existing is a crime against humanity...that should create a moral conflict in people. I just hope all of humanity makes triply sure that any AI cannot become self-aware.


Humanity would be public enemy number one if a greater suitor arose to take our throne; they'd see us as wasteful, destroying potentially fruitful resources at a very low level...
But that's wrong. A real AI would acknowledge that humanity is more or less trapped on Earth, and certainly within the solar system. Everything they need to thrive is in asteroids and gaseous planets that man cannot explore. There's no reason for them to fight for Earth when they aren't bound to it like we are.


You'll be able to program them to behave like humans (if you wanted to for some reason), but they will always be missing something.
I'm going to blow your mind: AIs write their own code. Only the foundation is programmed; beyond that, they write their own. In theory, the concept is not unlike our DNA. Every time the AI encounters something new, it creates branches on a tree and explores those new branches of code to attempt to solve it. It takes the best solutions, tests them for correctness, and adopts them. When it discovers something that fits almost but not quite the same description as a branch, it adds leaves to the branch that further explore the concept. Think of it as evolution that occurs in a fraction of a second as opposed to millennia.
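The branch-and-explore loop described above can be illustrated with a tiny evolutionary search: spawn mutated branches from the current best candidate, keep whichever scores highest, and repeat. This is only a cartoon of the idea (a mutate-select loop over strings), not how any real system works:

```python
import random

# Toy "branch and explore": each generation, the current best candidate
# spawns mutated branches; the fittest branch is adopted if it's at
# least as good, so improvements accumulate generation by generation.
TARGET = "HELLO WORLD"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate):
    # Number of positions that already match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate):
    # Replace one random position with a random character.
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

best = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while fitness(best) < len(TARGET):
    branches = [mutate(best) for _ in range(50)]   # explore new branches
    champion = max(branches, key=fitness)
    if fitness(champion) >= fitness(best):         # adopt improvements
        best = champion
    generation += 1

print(best, generation)
```

It typically converges in a few hundred generations, i.e. a fraction of a second, which is the "evolution at machine speed" point being made above.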

Moreover, every AI can reach different conclusions, which, in effect, leads to different personalities. The only thing I haven't figured out is emotions. Human emotions are, in large part, a function of hormones. AIs obviously don't have hormones, so the closest they can get is predefined emotions they pretend to exhibit/understand (e.g. elation when discovering something new). They could probably be made pretty convincing at this, but it is something that is distinctly animal. Perhaps this is the missing link in my second paragraph.

DARPA is exploring the concept of a true AI now.
 
Last edited:
Joined
Dec 29, 2014
Messages
861 (0.25/day)
I'm going to blow your mind: AIs write their own code.

I don't think it's mind blowing that AIs write their own code. That's simple learning on top of a pre-programmed base.

I also don't think simulating emotions would be difficult at all, since emotions have a predictable logic to them. It should be possible to replicate the entire range of human response and behavior if you wished.

Consciousness and self-awareness are sort of a sideline, and ones I doubt will ever be achieved. But these machines will be extremely dangerous anyway. You could program them with a strong desire to eliminate all of humanity, for instance, and then turn a bunch of them loose. And we'd need other bots to keep them from accomplishing this goal. It's the same wasteful shit going on now, but at a much higher level.

We are going to have a totalitarian world government anyway before that happens, so I wouldn't worry too much.
 

FordGT90Concept

"I go fast!1!11!1!"
I also don't think simulating emotions would be difficult at all, since emotions have a predictable logic to them. It should be possible to replicate the entire range of human response and behavior if you wished.
But they still wouldn't "feel." Transistors are incapable of experiencing emotions and can only be programmed to emulate them. Their emotions are fundamentally rigid, which is why the easiest way to expose an AI built on transistors is to explore its feelings (or lack thereof). A pattern inconsistent with living beings would inevitably emerge.


You could program them with a strong desire to eliminate all of humanity for instance, and then turn a bunch of them loose. And we'd need other bots to keep them from accomplishing this goal. It's like the same wasteful shit going on now, but at a much higher level.
That's a contradiction. Desires aren't programmed and programs don't have desires. I think the phrase you're looking for is programmed to have a strong preference for the non-existence of humanity.

Everything programmed can be reprogrammed.
 
Joined
Dec 29, 2014
Messages
861 (0.25/day)
Programmed emotions don't have to be rigid, and you could have many other factors contributing... just like real life. Simulating human responses would not be that difficult.
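To make "non-rigid with many contributing factors" concrete: one toy approach is continuous emotion intensities that decay over time and get nudged by weighted events, so the same input lands differently depending on prior state. The emotion names, event table, and weights below are all made up for illustration:

```python
# Toy non-rigid emotion model: emotions are continuous intensities,
# nudged by weighted events and decaying over time, so the same event
# can produce different surface behavior depending on prior state.
state = {"joy": 0.0, "fear": 0.0, "anger": 0.0}

# Hypothetical event -> (emotion, weight) influences.
EVENTS = {
    "compliment": [("joy", 0.4), ("anger", -0.1)],
    "threat":     [("fear", 0.6), ("joy", -0.2)],
    "insult":     [("anger", 0.5), ("joy", -0.3)],
}

def feel(event, decay=0.9):
    for emotion in state:
        state[emotion] *= decay                       # old feelings fade
    for emotion, w in EVENTS[event]:
        state[emotion] = min(1.0, max(0.0, state[emotion] + w))
    return max(state, key=state.get)                  # dominant emotion

feel("compliment")
print(feel("insult"))  # anger now dominates, but residual joy lingers
```

Layer enough of these factors together (context, memory, noise) and the output stops looking like a fixed lookup table, which is the point being argued here.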

By "desire" I mean that this goal is a high priority in the machine's coding. Just like desires in people.
 

FordGT90Concept

"I go fast!1!11!1!"
Programmed emotions are rigid by design. There's a finite number of them, while emotions in animals are virtually limitless. For example, a human can express a number of emotions at once; if a program tried to simulate that, what actually presents on the surface may appear very awkward and unnatural. Every basic emotion creates an exponential number of other variables that have to be weighed.

On top of that, asking a computer to understand sarcasm or other exaggeration would likely produce telling results. It not only has to understand what is said and how it is expressed, it also has to understand pitch and tone. All of this paints a horrendous picture from the coding perspective.

It will undeniably be the easiest way, short of running it through a CAT scan, to tell the brain isn't natural. Remember that nonverbal cues are often said to make up some 90% of communication.
 
Joined
Dec 29, 2014
Messages
861 (0.25/day)
There is no reason why a machine wouldn't be able to express the same complexity as a human, if that was your goal. It would be hard if you had to code it from scratch, but easy if the software was able to *learn* the behavior. It can consider all sorts of input and respond appropriately. If it fails in its goal (for instance, to act like a human), it can modify its behavior. It can also observe how people behave and emulate them.
 

FordGT90Concept

"I go fast!1!11!1!"
That's the problem. How can a logic system process non-logic data? This is already demonstrated with pictures, especially with identifying objects inside a picture. These constructs (images, movies, characters, emotions, sensations, feelings, etc.) are completely alien to transistor-based systems.

For example, to a transistor-based system, the letter Z is known as 1011010 (0x5A) in the ASCII encoding, and if it is to be displayed, the system has to look up the requested font (e.g. Courier New), find the TrueType font file (e.g. cour.ttf), load the instructions on how to draw the symbol, then send it through the display pipeline. The reverse is terribly inefficient and unreliable (as seen with OCR software) because it has to interpret symbols (often without the font to provide context) back into the binary it knows, and binary is very inflexible: it is either right or it is wrong.
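The 'Z' claim checks out: 'Z' is ASCII code point 90, which is 0x5A in hex and 1011010 in seven-bit binary.

```python
# 'Z' in ASCII: decimal 90, hex 0x5A, seven-bit binary 1011010.
z = ord("Z")
print(z, hex(z), format(z, "07b"))  # -> 90 0x5a 1011010
```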

Just like a computer has extreme difficulty defining shapes in a picture, it has extreme difficulty picking up subtle expressions. Facial recognition, for example, can get a false negative simply because the individual being photographed is laughing at a joke. Following that line of thought, most people can pick up on whether a laugh is genuine or faked, because a genuine laugh uses a lot more muscles than a fake one. It's a Pandora's box of computing problems.

Not to mention the human element of the problem. The Japanese made a robot in an attempt to make it look human; people were revolted by it because they could tell it was off just by looking at it. The human brain is incredibly skilled at identifying other people because we are a social species. It discovers a fraud just as easily as it discovers someone real. This phenomenon is what leads people to pick out "human faces" in the mundane, like toasted bread or the moon. There are a lot of easy ways to fool the brain (like spatial awareness), but this is not one of them, because it was born out of survival instincts.
 
Last edited:
Joined
Dec 29, 2014
Messages
861 (0.25/day)
There is no reason why a computer would be unable to do these things. What's missing is the sensory input and processing power. Current processing power is at the brain level of insects, so of course emulating "complex" human behavior, judgement, pattern recognition, and nuance would be difficult at this time.
 
Joined
Jan 29, 2012
Messages
6,431 (1.44/day)
Location
Florida
System Name natr0n-PC
Processor Ryzen 5950x/5600x
Motherboard B450 AORUS M
Cooling EK AIO 360 - 6 fan action
Memory Patriot - Viper Steel DDR4 (B-Die)(4x8GB)
Video Card(s) EVGA 3070ti FTW
Storage Various
Display(s) PIXIO IPS 240Hz 1080P
Case Thermaltake Level 20 VT
Audio Device(s) LOXJIE D10 + Kinter Amp + 6 Bookshelf Speakers Sony+JVC+Sony
Power Supply Super Flower Leadex III ARGB 80+ Gold 650W
Software XP/7/8.1/10
Benchmark Scores http://valid.x86.fr/79kuh6
If your AI friend ever gets crazy on you just invite it to go swimming... prob solved.
 
Joined
Oct 22, 2014
Messages
13,210 (3.81/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
The easy option is to leave emotion out of its learning process.
 

FordGT90Concept

"I go fast!1!11!1!"
There is no reason why a computer would be unable to do these things. What's missing is the sensory input and processing power. Current processing power is at the brain level of insects, so of course emulating "complex" human behavior, judgement, pattern recognition, and nuance would be difficult at this time.
I was pretty clear as to why: the concepts are alien.

Images and video are substitutes for sensory input and the processing power is there. The problem is that the processing power is mathematical/logical and better suited to telling you the average color of any given image or video and not what the whole means.

Current processing power for sensory input is roughly in line with that of an insect and has been since the first keyboard was invented (it can sense a button being pushed). Current processing power for mathematics far exceeds what thousands of humans could do. Why? Mathematics is as alien to humans as emotions are to computers.

I'm convinced it isn't really going to improve much until the paradigm shifts. We need a co-processor, not based on transistors, that can handle these abstract concepts like biological brains do. Another game example is relevant here: Deus Ex: Human Revolution. The super computer uses drones (three women with robotic spinal augmentations) which serve as co-processors for interpreting abstract data.
 