Machine learning

Definition
Machine learning is a subset of artificial intelligence that enables systems to learn patterns from data and improve their performance on a task without being explicitly programmed. Instead of relying on fixed rules, machine learning models adapt and make predictions or decisions based on new inputs.
It is widely applied in industries such as finance, healthcare, retail, and technology for tasks like fraud detection, recommendation systems, forecasting, and personalisation.
Advanced
Machine learning algorithms fall into three broad categories: supervised learning, unsupervised learning, and reinforcement learning. Supervised learning trains models on labelled data, unsupervised learning discovers hidden structure in unlabelled data, and reinforcement learning optimises actions through trial and error guided by a reward signal.
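As a brief illustration of supervised learning, the sketch below fits a classifier to a handful of labelled examples with scikit-learn and then predicts the label of a new input. The feature values, labels, and the fraud-detection framing are invented purely for demonstration.

    # Minimal supervised learning sketch; the data is invented for illustration.
    from sklearn.linear_model import LogisticRegression

    # Labelled training data: each row is [transaction_amount, hour_of_day],
    # and each label marks whether that transaction was fraudulent (1) or not (0).
    X_train = [[25.0, 14], [900.0, 3], [40.0, 10], [1200.0, 2], [15.0, 16], [700.0, 4]]
    y_train = [0, 1, 0, 1, 0, 1]

    # Supervised learning: fit the model to the labelled examples.
    model = LogisticRegression()
    model.fit(X_train, y_train)

    # Predict the label of a new, unseen transaction.
    print(model.predict([[850.0, 3]]))  # likely [1], i.e. flagged as suspicious

By contrast, an unsupervised method such as k-means clustering would group similar transactions without any labels at all, and a reinforcement learning agent would learn a policy from rewards rather than from a fixed labelled dataset.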
Deep learning, a specialised form of machine learning, uses neural networks with multiple layers to process large datasets and handle tasks like image recognition, speech processing, and natural language understanding. Advanced applications require significant computational power, often using GPUs or cloud infrastructure.
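To make "multiple layers" concrete, the sketch below defines a small fully connected network in PyTorch and runs a single training step on random data. The layer sizes, batch size, and learning rate are arbitrary choices for illustration, not a recipe for a real image or speech model.

    import torch
    import torch.nn as nn

    # A deep network is a stack of layers; real models for image recognition or
    # speech processing are far larger and are trained on far more data.
    model = nn.Sequential(
        nn.Linear(64, 128),  # input layer: 64 features in, 128 units out
        nn.ReLU(),
        nn.Linear(128, 64),  # hidden layer
        nn.ReLU(),
        nn.Linear(64, 10),   # output layer: scores for 10 classes
    )

    # One training step on a random batch, just to show the shape of the loop.
    x = torch.randn(32, 64)            # batch of 32 examples with 64 features each
    y = torch.randint(0, 10, (32,))    # random class labels
    optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
    loss = nn.CrossEntropyLoss()(model(x), y)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

The reference to GPUs and cloud infrastructure points at running essentially the same code on hardware that processes large batches in parallel, for example by moving the model and data to a GPU with model.to("cuda") in PyTorch.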
Why it matters
Use cases
Metrics
Issues
Example
A streaming platform uses machine learning to analyse viewing history and recommend shows. The system learns user preferences over time, improving engagement and increasing subscription retention.
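A much simplified sketch of that idea is item-based collaborative filtering: shows watched by similar sets of users are treated as similar, and the platform recommends the nearest neighbour of something the user has already watched. The viewing matrix and show names below are invented for illustration; production recommenders are far more elaborate.

    import numpy as np

    shows = ["Drama A", "Drama B", "Sci-fi C", "Sci-fi D"]
    # Rows are users, columns are shows; 1 means the user watched that show.
    views = np.array([
        [1, 1, 0, 0],
        [1, 1, 0, 0],
        [0, 0, 1, 1],
        [1, 0, 0, 1],
    ])

    # Cosine similarity between shows, based on which users watched them.
    norms = np.linalg.norm(views, axis=0)
    similarity = (views.T @ views) / np.outer(norms, norms)

    # Recommend the show most similar to one the user just finished.
    watched = shows.index("Drama A")
    scores = similarity[watched].copy()
    scores[watched] = -1.0  # exclude the show itself
    print("Because you watched Drama A, try:", shows[int(np.argmax(scores))])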