# Machine Learning and Neural Networks

## Documentation

**Machine Learning**

- Introduction to ML - basic concepts of machine learning
- Linear models - searching for optimal parameters
- Gradient descent - a method for finding optimal parameters
- Computational graphs - the foundation of modern ML frameworks
- Probabilistic methods - entropy, conditional probability, Markov models of language
- The Bayesian method
- Entropy
- N-grams - sequence prediction
- Feature space - a bit of math
- Probabilistic logic
- Fuzzy logic
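
The list above mentions N-grams and Markov models of language. As a minimal sketch (not from the course materials), a character-level bigram model can be built from plain frequency counts:

```python
from collections import Counter, defaultdict

def bigram_model(text):
    """Count character bigrams: for each char, a frequency table of successors."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, ch):
    """Predict the most frequent character following `ch` in the training text."""
    if ch not in counts:
        return None
    return counts[ch].most_common(1)[0][0]

model = bigram_model("abracadabra")
print(predict_next(model, "a"))  # 'b' - it follows 'a' most often
```

The same counting idea extends to word-level N-grams by sliding a window over tokens instead of characters.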

**Neural Networks**

- Embedding - introduction to vector embeddings
- Embedding Word2Vec - the basic **skip-gram** and **CBOW** methods
- Embedding ELMo (NN_Embedding_Elmo.html) - context-based and character-based embeddings
- Recurrent networks in PyTorch
- RNN - Character prediction
- RNN Encoder-Decoder
- Pre-defined Embedding vectors - **GloVe**, **fastText**
- Attention - attention mechanism
- Transformer architecture
- BERT model
- GPT model
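
The Word2Vec entry above covers the **skip-gram** method, which trains on (center, context) word pairs. A minimal sketch (my illustration, not course code) of how such pairs can be generated:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs for skip-gram."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # the center word is not its own context
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat"], window=1))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

CBOW inverts the task: it predicts the center word from the surrounding context words.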

**Frameworks**

- Tensors in NumPy - introduction to **numpy** (tensors and shapes)
- PyTorch: Tensors - tensors as the foundation of the **PyTorch** library
- PyTorch: Computational graphs
- PyTorch: Networks
- PyTorch: Networks - reference guide
- Keras: Tensors - introduction to **keras** tensors
- Keras: RNN - recurrent layers in **keras**
- Keras: Embedding
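
Since the list above starts with tensors and shapes in **numpy**, a small sketch of the basics (my example, not taken from the course pages):

```python
import numpy as np

x = np.arange(24).reshape(2, 3, 4)   # a rank-3 tensor
print(x.shape)                       # (2, 3, 4)
print(x.ndim)                        # 3

y = x.reshape(6, 4)                  # reshape keeps the data, changes the view
print(y.shape)                       # (6, 4)

print(x.sum(axis=-1).shape)          # (2, 3): reduction over the last axis drops it
```

PyTorch tensors follow the same shape/axis conventions, which is why NumPy is a natural starting point.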

## Basic terms

**Data and main tasks**

- Features (**one-hot** encoding)
- Regression, classification, clustering
- Datasets
- Data normalization
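
Two of the items above, **one-hot** features and data normalization, can be sketched in a few lines of NumPy (an illustration under my own choice of example data):

```python
import numpy as np

# one-hot encoding: each class label becomes a unit row vector
labels = np.array([0, 2, 1, 2])
num_classes = 3
one_hot = np.eye(num_classes)[labels]   # shape (4, 3), one row per sample
print(one_hot)

# min-max normalization of a feature column into [0, 1]
x = np.array([10.0, 20.0, 40.0])
x_norm = (x - x.min()) / (x.max() - x.min())
print(x_norm)   # [0.  0.333...  1.]
```
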

**Quality metrics**

- Loss, Accuracy
- Entropy and cross-entropy
- Class imbalance
- Perplexity, BLEU
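
Cross-entropy, listed above, measures how well a predicted distribution q matches the true distribution p. A minimal sketch (my own example values):

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i), natural log; eps guards against log(0)."""
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

p = [1.0, 0.0, 0.0]          # true one-hot distribution
q = [0.7, 0.2, 0.1]          # predicted probabilities
print(cross_entropy(p, q))   # -log(0.7), about 0.357
```

For a one-hot target, cross-entropy reduces to the negative log-probability assigned to the correct class, which is exactly the standard classification loss.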

**Classic supervised ML methods**

- Linear models
- Nearest neighbour method
- Bayesian classifiers
- Decision trees
- Support vector machines
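
The nearest neighbour method in the list above is simple enough to sketch directly (a 1-NN toy example of my own, not course code):

```python
import numpy as np

def nearest_neighbour(X_train, y_train, x):
    """Classify x by the label of its closest training point (Euclidean distance)."""
    d = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(d)]

X = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
y = np.array([0, 0, 1])
print(nearest_neighbour(X, y, np.array([4.0, 4.5])))  # 1 - closest to (5, 5)
```

The k-NN generalization takes a majority vote over the k closest training points instead of just one.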

**Classic unsupervised ML methods**

- **K**-means clustering
- **DBSCAN** clustering
- Dimensionality reduction: principal components, **t-SNE**
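
**K**-means, listed above, alternates two steps: assign each point to its nearest centre, then move each centre to the mean of its points. A compact sketch (my illustration, on toy data):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: alternate nearest-centre assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assignment step: each point goes to its nearest centre
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # update step: each centre moves to the mean of its points
        centers = np.array([X[labels == i].mean(axis=0) for i in range(k)])
    return labels, centers

X = np.array([[0, 0], [0, 1], [10, 10], [10, 11]], dtype=float)
labels, centers = kmeans(X, 2)
print(labels)  # the two well-separated pairs end up in different clusters
```

A production implementation would also handle empty clusters and use a smarter initialization such as k-means++.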

**Simple neural networks**

- Feature space transformation
- Activation functions: **sigmoid, tanh, ReLU**
- Neuron as a separating surface
- Types of architectures and layers
- Gradient methods
- Computational graph
- Learning techniques
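
The activation functions named above are one-liners in NumPy; a quick sketch of their ranges (my own example inputs):

```python
import numpy as np

def sigmoid(x):
    """Squashes inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Zeroes out negatives, passes positives through."""
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))   # values in (0, 1), sigmoid(0) = 0.5
print(np.tanh(x))   # values in (-1, 1)
print(relu(x))      # [0. 0. 2.]
```
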

**Convolutional networks**

**Working with text data**

- Bag of words
- Word2Vec

**Recurrent networks**

- SimpleRNN
- LSTM

**Sequence2Sequence**

- **Encoder-Decoder** architecture
- **Attention** mechanism
- **Transformer** architecture
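
The **Attention** mechanism above, in its scaled dot-product form used by the Transformer, is softmax(QKᵀ/√d)·V. A minimal NumPy sketch with a single query (my own toy vectors):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

Q = np.array([[1.0, 0.0]])                    # one query
K = np.array([[1.0, 0.0], [0.0, 1.0]])        # two keys
V = np.array([[10.0, 0.0], [0.0, 10.0]])      # their values
out, w = attention(Q, K, V)
print(w)    # the query attends more to the first (matching) key
print(out)
```

Multi-head attention runs several such maps in parallel on learned projections of Q, K, and V.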

**Reinforcement learning**

## Varia

### Jupyter extensions

Useful tips: `pip install jupyter_contrib_nbextensions`, then `jupyter contrib nbextension install --user`.

After launching **jupyter notebook**, a new **Nbextensions** tab will appear. Enable the extensions you need there, e.g.:

- **Table of Contents**
