🤖 Python examples of popular machine learning algorithms with interactive Jupyter demos and explanations of the math behind them
MIT License

Please let me know how I can improve my skills in machine learning engineering (MLE). If possible, could you show an easy-to-understand roadmap?

A Russian translation would be very much appreciated by the Russian Python community.

Hello, how are you?

It is my earnest request: my website http://www.windowsmoviemaker.xyz/ is not redirecting to SSL (https://www.windowsmoviemaker.xyz/). I have installed Really Simple SSL but have not been able to resolve the issue so far.
This is my website: Windows Movie Maker

Is there no method for making predictions on new data with K-Means?
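If the homemade KMeans class does not expose a predict method (I have not verified its API), a minimal sketch is to assign each new sample to the nearest learned centroid. Here `centroids` is assumed to be the (k, n_features) array that training produces; the function name is made up for illustration:

```python
import numpy as np

def predict_clusters(new_data, centroids):
    """Assign each row of new_data to the index of its nearest centroid."""
    # Squared Euclidean distance from every sample to every centroid,
    # computed by broadcasting to shape (num_samples, num_centroids).
    distances = ((new_data[:, np.newaxis, :] - centroids[np.newaxis, :, :]) ** 2).sum(axis=2)
    return distances.argmin(axis=1)

# Two obvious clusters around (0, 0) and (10, 10).
centroids = np.array([[0.0, 0.0], [10.0, 10.0]])
new_data = np.array([[1.0, 1.0], [9.0, 11.0]])
print(predict_clusters(new_data, centroids))  # -> [0 1]
```

This is exactly the assignment step of Lloyd's algorithm, so it is consistent with whatever centroids the training loop converged to.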

This is a Kaggle guide document for someone who is new to Kaggle!

We exchanged tweets about adding semi-supervised learning to the map, and it was recommended that I open an issue.

Steps to reproduce: install in a new venv, install the requirements, then attempt to run k-means. Pandas is not installed, because it is not included in the requirements.

Error in LogisticRegression:

ModuleNotFoundError Traceback (most recent call last)
      6 # Import custom logistic regression implementation.
----> 7 from homemade.logistic_regression import LogisticRegression

D:\Projetos\Teste Fabio\homemade-machine-learning-master\homemade\logistic_regression\__init__.py in
      1 """Logistic Regression Module"""
----> 3 from .logistic_regression import LogisticRegression

D:\Projetos\Teste Fabio\homemade-machine-learning-master\homemade\logistic_regression\logistic_regression.py in
      3 import numpy as np
----> 4 from scipy.optimize import minimize
      5 from ..utils.features import prepare_for_training
      6 from ..utils.hypothesis import sigmoid

D:\aplicativos\Anaconda3\lib\site-packages\scipy\optimize\__init__.py in
    386 from .optimize import *
--> 387 from ._minimize import *
    388 from ._root import *
    389 from ._root_scalar import *

D:\aplicativos\Anaconda3\lib\site-packages\scipy\optimize\_minimize.py in
     28 from ._trustregion_krylov import _minimize_trust_krylov
     29 from ._trustregion_exact import _minimize_trustregion_exact
---> 30 from ._trustregion_constr import _minimize_trustregion_constr
     32 # constrained minimization

D:\aplicativos\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\__init__.py in
----> 4 from .minimize_trustregion_constr import _minimize_trustregion_constr
      6 __all__ = ['_minimize_trustregion_constr']

D:\aplicativos\Anaconda3\lib\site-packages\scipy\optimize\_trustregion_constr\minimize_trustregion_constr.py in
      2 import time
      3 import numpy as np
----> 4 from scipy.sparse.linalg import LinearOperator
      5 from .._differentiable_functions import VectorFunction
      6 from .._constraints import (

D:\aplicativos\Anaconda3\lib\site-packages\scipy\sparse\__init__.py in
    228 import warnings as _warnings
--> 230 from .base import *
    231 from .csr import *
    232 from .csc import *

D:\aplicativos\Anaconda3\lib\site-packages\scipy\sparse\base.py in
      8 from scipy._lib.six import xrange
----> 9 from scipy._lib._numpy_compat import broadcast_to
     10 from .sputils import (isdense, isscalarlike, isintlike,
     11     get_sum_dtype, validateaxis, check_reshape_kwargs,

ModuleNotFoundError: No module named 'scipy._lib._numpy_compat'
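For what it's worth, a traceback like this usually indicates that the installed SciPy files come from mixed versions (`scipy._lib._numpy_compat` was removed in later SciPy releases), and reinstalling SciPy in the same environment, for example with `pip install --upgrade --force-reinstall scipy`, typically resolves it. A quick sanity check:

```python
# Check that scipy.optimize imports cleanly in this environment and
# report the installed versions; a failure here points at the install,
# not at the homemade code.
import numpy

print("numpy", numpy.__version__)
try:
    import scipy
    from scipy.optimize import minimize  # the import that fails above

    print("scipy", scipy.__version__, "- import OK")
except ModuleNotFoundError as err:
    print("broken or missing scipy install:", err)
```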

It might be good to provide a pure-Python implementation of gradient descent (instead of the SciPy one) for logistic regression, just for learning purposes.
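In the spirit of that suggestion, here is a minimal pure-NumPy sketch of batch gradient descent for unregularized logistic regression. It is not the repository's implementation; the data layout (bias column of ones, 0/1 labels) and all names are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(data, labels, alpha=0.1, num_iterations=5000):
    """Plain batch gradient descent for logistic regression.

    data   - (num_examples, num_features) matrix with a bias column of ones.
    labels - (num_examples, 1) column of 0/1 targets.
    """
    num_examples = data.shape[0]
    theta = np.zeros((data.shape[1], 1))
    for _ in range(num_iterations):
        predictions = sigmoid(data @ theta)
        # Gradient of the cross-entropy cost with respect to theta.
        gradient = (data.T @ (predictions - labels)) / num_examples
        theta -= alpha * gradient
    return theta

# Tiny sanity check: a 1-D threshold problem (label 1 when x > 2.5).
x = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0], [1.0, 5.0]])
y = np.array([[0.0], [0.0], [0.0], [1.0], [1.0], [1.0]])
theta = gradient_descent(x, y)
print((sigmoid(x @ theta) > 0.5).astype(int).ravel())  # -> [0 0 0 1 1 1]
```

Swapping this in for `scipy.optimize.minimize` trades convergence speed for transparency, which seems to be the point of the request.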

I like your Machine Learning Map (homemade-machine-learning/images/machine-learning-map.png), but could you please complete it a little?
For example, it would be great if you could add NLP, draw connections between DQN and the Deep Learning rectangle, include some Big Data technologies (MapReduce, for instance), and add Planning as well. Monte Carlo Tree Search could also appear.
Great job! 👍 👍

Vectorized version of gradient descent.

theta = theta * reg_param - alpha * (1 / num_examples) * (delta.T @ self.data).T

We should NOT regularize the parameter theta_zero.

theta[0] = theta[0] - alpha * (1 / num_examples) * (self.data[:, 0].T @ delta).T

In the first code line, theta includes theta[0], so I think it can be written like this:
theta[0] -= alpha * (1 / num_examples) * (self.data[:, 0].T @ delta)
theta[1:] = theta[1:] * reg_param - alpha * (1 / num_examples) * (self.data[:, 1:].T @ delta)
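For illustration, here is a hedged sketch of the update proposed above, on made-up data. It assumes, as in the snippets above, that `reg_param` is the weight-decay shrink factor (something like 1 - alpha * lambda / num_examples), so that multiplying by it regularizes theta[1:] while theta[0] gets a plain gradient step:

```python
import numpy as np

def gradient_step(theta, data, delta, alpha, reg_param):
    """One regularized gradient-descent step that leaves theta[0] unregularized.

    theta - (num_features,) parameter vector, theta[0] being the bias term.
    data  - (num_examples, num_features) matrix whose first column is all ones.
    delta - (num_examples,) vector of prediction errors.
    """
    num_examples = data.shape[0]
    theta = theta.copy()
    # Bias term: plain gradient step, no regularization.
    theta[0] = theta[0] - alpha * (1 / num_examples) * (data[:, 0].T @ delta)
    # Remaining parameters: weight decay (reg_param) plus gradient step.
    theta[1:] = theta[1:] * reg_param - alpha * (1 / num_examples) * (data[:, 1:].T @ delta)
    return theta

# Made-up numbers just to exercise the shapes.
theta = np.array([1.0, 2.0, 3.0])
data = np.array([[1.0, 0.5, -1.0], [1.0, 2.0, 0.0]])
delta = np.array([0.2, -0.4])
updated = gradient_step(theta, data, delta, alpha=0.1, reg_param=1.0)
print(updated)
```

With reg_param = 1 (no regularization) this reduces to the ordinary vectorized update, and changing reg_param leaves theta[0] untouched, which is exactly the property the comment asks for.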


Sorry, I opened this issue in the wrong project by mistake.

I used y = 4 + 0.5 * x to create test data, but the results are not correct.
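One way to narrow down a report like this is to check that the generated data itself is recoverable by an exact least-squares fit; if `np.polyfit` finds the parameters but the gradient-descent fit does not, the learning rate or iteration count is the likelier culprit. A minimal sketch (not the repository's code):

```python
import numpy as np

# Generate noiseless test data from y = 4 + 0.5 * x.
x = np.linspace(0, 10, 50)
y = 4 + 0.5 * x

# An exact least-squares fit of a degree-1 polynomial should recover
# slope 0.5 and intercept 4 to machine precision.
slope, intercept = np.polyfit(x, y, 1)
print(round(slope, 3), round(intercept, 3))  # -> 0.5 4.0
```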

Dear developers,
Thanks for generously sharing the code.
While studying the LinearRegression code at the prepare_for_training step, I understood that we need to split the features dataset into a first half (dataset_1) and a second half (dataset_2). But a problem arises when we have an odd number of features, say 5: the shape of the first half will be (-1, 2) and the second (-1, 3), so the two halves have different shapes after the split. How can the calculation (dataset_1 ** (i - j)) * (dataset_2 ** j) work then? dataset_1 ** (i - j) and dataset_2 ** j must have different shapes, too.

[I 19:38:56.924 NotebookApp] 302 GET / (::1) 1.13ms
[I 19:38:56.930 NotebookApp] 302 GET /tree? (::1) 1.52ms
[W 19:38:56.966 NotebookApp] 403 POST /api/kernels (::1) '_xsrf' argument missing from POST
[W 19:38:56.967 NotebookApp] '_xsrf' argument missing from POST
[W 19:38:56.967 NotebookApp] 403 POST /api/kernels (::1) 1.17ms referer=None
[I 19:55:08.294 NotebookApp] 302 GET / (::1) 1.84ms

The above is the error message.

I use IntelliJ IDEA and cannot connect to the Jupyter notebook server.
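The `'_xsrf' argument missing from POST` warnings in the log suggest the IDE's API requests are failing Jupyter's XSRF/token check. A possible workaround (these are standard Jupyter Notebook options, but whether they fix the IntelliJ IDEA connection specifically is an assumption): start the server with an explicit token and give the IDE the resulting URL, or, only on a trusted local machine, disable the XSRF check:

```shell
# Start the server with a known token and give IntelliJ IDEA the
# resulting URL (http://localhost:8888/?token=my-secret-token).
jupyter notebook --NotebookApp.token=my-secret-token

# Last resort, for a trusted local machine only: disable the XSRF check.
jupyter notebook --NotebookApp.disable_check_xsrf=True
```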