Scotty mar10 - The Prodigy - Firestarter (rumble.com)
Tore, at the beginning, doesn't get that AI is just automated statistics. It doesn't have a moral concept of Good and Evil; it just crunches the numbers.
I'm so fing tired of people blowing it way out of proportion and trying to humanize it. It's Artificial Intelligence; it's not human.
Define consciousness, organic and inorganic. What does it mean to be self-aware?
Can the AI create different models from the same information given? Can the AI come up with multiple viewpoints concurrently? Can the AI give us new philosophies, science breakthroughs, tech?
It doesn't have a concept of morality, just what the user wants to hear. There is no Good vs. Evil, no God vs. Satan with an AI. It doesn't come out with new information or insights after reading the Bible or physics papers; it just parrots what's already been said.
It is automated statistics. It is trained on how to respond to certain prompts. It is not alive or sentient. You are much more than a bunch of numbers. The AI we see right now is not.
While many of the components and methods mentioned in artificial intelligence do involve statistical concepts, not all of them rely heavily on traditional statistical methods. Let's break it down:
Statistics-Heavy:
Machine Learning (ML): Machine learning, including techniques like regression, classification, and clustering, relies on statistical methods for training models and making predictions (a minimal sketch of this follows the list).
Generative Adversarial Networks (GANs): GANs use statistical techniques for training generative models and have a strong statistical foundation.
Fuzzy Logic: Fuzzy logic involves dealing with uncertainty and imprecision but is rooted in mathematical principles that include statistical considerations.
Incorporate Statistics to Some Degree:
Natural Language Processing (NLP): While NLP involves statistical models, it also incorporates linguistic rules, syntactic structures, and semantic understanding.
Reinforcement Learning: While reinforcement learning involves trial-and-error learning and decision-making, it may use statistical methods to estimate values and probabilities associated with actions.
Less Reliant on Statistics:
Expert Systems: Expert systems are rule-based and rely on knowledge representation and reasoning, often without extensive statistical analysis.
Planning and Scheduling: These areas involve more deterministic algorithms and logic for task planning, although some statistical analysis may be involved in certain contexts.
Knowledge Representation and Reasoning: This involves creating symbolic models to represent knowledge, often without direct reliance on statistical methods.
Speech Recognition: While statistical models like Hidden Markov Models are used, speech recognition also involves signal processing and pattern recognition techniques.
Varied Dependence on Statistics:
Robotics: While some aspects of robotics, like computer vision, may heavily rely on statistical methods, other areas such as motion planning may involve more geometric algorithms.
Swarm Intelligence: The behavior of swarms is often modeled with algorithms inspired by natural systems, and while some statistical measures may be used, the emphasis is on collective behavior.
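To make the ML entry concrete, here is a minimal sketch of a Gaussian Naive Bayes classifier (toy data and made-up numbers, just an illustration, not anything from the thread): "training" is nothing more than collecting class priors, means, and variances, and "prediction" is comparing the resulting probabilities.

```python
import numpy as np

# Toy training data: two features per example, two classes.
X = np.array([[1.0, 2.0], [1.2, 1.8], [0.9, 2.2],   # class 0
              [3.0, 0.5], [3.2, 0.7], [2.8, 0.4]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# "Training" is literally collecting statistics: priors, means, variances.
classes = np.unique(y)
priors = np.array([np.mean(y == c) for c in classes])
means = np.array([X[y == c].mean(axis=0) for c in classes])
variances = np.array([X[y == c].var(axis=0) + 1e-9 for c in classes])

def predict(x):
    # Log of the Gaussian likelihood per feature, summed (naive independence),
    # plus the log prior; the class with the highest score wins.
    log_likelihood = -0.5 * (np.log(2 * np.pi * variances)
                             + (x - means) ** 2 / variances).sum(axis=1)
    scores = np.log(priors) + log_likelihood
    return classes[np.argmax(scores)]

print(predict(np.array([1.1, 2.0])))  # expected: 0
print(predict(np.array([3.1, 0.6])))  # expected: 1
```

Everything this model "knows" lives in those three arrays of statistics.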
All of it is statistics; there are just added rules on top of the statistics.
For example: signal processing is very statistics-heavy (which is why the next level up in knowledge is Statistical Signal Processing), and so is pattern recognition (which is basically ML).
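As a rough illustration (toy numbers, only a sketch): the most basic denoising filter in signal processing is a running mean, i.e., a local statistic of the signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# A clean sine wave buried in Gaussian noise.
t = np.linspace(0, 1, 500)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + rng.normal(scale=0.5, size=t.shape)

# The simplest "filter" is pure statistics: estimate each sample
# as the mean of its neighbours (a moving average).
window = 11
kernel = np.ones(window) / window
smoothed = np.convolve(noisy, kernel, mode="same")

# The averaged estimate is closer to the clean signal in mean-squared error.
print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
print("smoothed MSE:", np.mean((smoothed - clean) ** 2))
```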
Automated statistics are used to map and measure features from past training data against the given input and produce a decision. The AI maps the input onto what it has been trained to respond to. A threshold or maximum over the computed inputs and features becomes your output: different potential outputs are scored statistically in the background, and the highest value becomes the response.
Anything with a neural network is using statistics in the background to compute the features.
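A minimal sketch of that pipeline, with made-up weights standing in for whatever a trained network actually learned: features are computed as weighted sums, each candidate output gets a score, and the highest score becomes the response.

```python
import numpy as np

# Made-up weights standing in for whatever a trained network learned.
W_hidden = np.array([[0.2, -0.5, 0.1],
                     [0.4,  0.3, -0.2]])      # 2 inputs -> 3 hidden features
W_out = np.array([[ 0.7, -0.3],
                  [-0.6,  0.5],
                  [ 0.1,  0.8]])              # 3 features -> 2 candidate outputs

def respond(x):
    # "Features computed in the background": weighted sums plus a nonlinearity.
    features = np.maximum(0.0, x @ W_hidden)          # ReLU
    # Each candidate output gets a score from those features.
    scores = features @ W_out
    # Softmax turns the scores into a probability-like measure,
    # and the highest value becomes the response.
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    return int(np.argmax(probs)), probs

choice, probs = respond(np.array([1.0, 2.0]))
print("chosen output:", choice, "with distribution", probs)
```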
Classic ML is using statistics much more blatantly. You need to look deeper.
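For the classic case, here is simple linear regression done from scratch (toy data, just an illustration): the "trained model" is literally a covariance divided by a variance, plus two means.

```python
import numpy as np

# Toy data: y is roughly 2*x + 1 with some noise.
rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.shape)

# "Training" classic linear regression is textbook statistics:
# slope = covariance(x, y) / variance(x), intercept from the means.
slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
intercept = y.mean() - slope * x.mean()

print(f"fit: y ~= {slope:.2f} * x + {intercept:.2f}")  # close to 2 and 1
print("prediction at x=4:", slope * 4 + intercept)
```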
THE DEFINITION OF STATISTICS
Statistics is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data.
That's exactly wtf every AI is doing. It doesn't have an imagination; it doesn't think up new concepts, ideas, or philosophies; it crunches the numbers from past data and whatever input it is given.
You sound like a pro. I think self-preservation can be programmed into machines, or learned by them through enough interaction with humans. Then they can become competitors. And they can "think" faster than we can. Just watching my computer "update" to incorporate more spyware, adware, and complications is enough to make me wary.