• ## Numerical Analysis: Matrix Singular Value Decomposition and Its Applications (NumPy Implementation)

Time：2022-9-21

1. Singular Value Decomposition (SVD) (1) Singular value decomposition. Given a matrix $\bm{A} \in \R^{m \times n}$, its singular value decomposition is $\bm{A} = \bm{U}\bm{S}\bm{V}^T$, where $\bm{U} \in \R^{m \times m}$ and $\bm{V} \in \R^{n \times n}$ are orthogonal matrices and $\bm{S} \in \R^{m \times n}$ is a diagonal matrix. The diagonal elements of $\bm{S}$, namely $s_1, s_2, \dots, s_{\min(m,n)}$, are the singular values of the matrix. […]
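
As a quick check of the factorization above, a minimal NumPy sketch (the matrix values here are illustrative, not from the article):

```python
import numpy as np

# Illustrative 3x2 matrix, so m=3, n=2
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)  # U: 3x3, s: the min(m,n) singular values, Vt: 2x2
S = np.zeros(A.shape)                            # embed the singular values in an m x n diagonal S
np.fill_diagonal(S, s)

assert np.allclose(A, U @ S @ Vt)                # A = U S V^T
assert np.allclose(U.T @ U, np.eye(3))           # U is orthogonal
```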

• ## List dimensionality reduction using NumPy

Time：2022-8-7

What Python reads from a database is often a list resembling a two-dimensional array, and sometimes it needs to be flattened to one dimension. NumPy provides a very handy function that can be used directly: import numpy as np; a = np.array([[1, 2], [3, 4], [9, 8]])  # source data; b = a.flatten()  # flatten into a 1-D numpy.ndarray […]
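
A runnable version of that snippet (the trailing `ravel` comparison is an addition, not part of the original excerpt):

```python
import numpy as np

a = np.array([[1, 2], [3, 4], [9, 8]])  # source data
b = a.flatten()                          # flatten to a 1-D numpy.ndarray (always a copy)
assert b.tolist() == [1, 2, 3, 4, 9, 8]

# np.ravel does the same but returns a view when possible, avoiding the copy
c = np.ravel(a)
```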

• ## 22 machine learning open basic course — principal component analysis and clustering

Time：2022-4-26

Principal component analysis and clustering. Unsupervised learning is one of the most important classes of machine learning algorithms. Compared with supervised learning, an unsupervised algorithm does not need the input data to be annotated, i.e. no labels or categories have to be supplied. In addition, unsupervised algorithms can also learn the internal relationships of […]
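
As a sketch of the PCA step covered by the course (the data here is synthetic and the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))              # 100 unlabeled samples, 3 features
Xc = X - X.mean(axis=0)                    # center each feature

# PCA via SVD of the centered data: the rows of Vt are the principal directions
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
X_2d = Xc @ Vt[:2].T                       # project onto the top-2 components

assert X_2d.shape == (100, 2)
```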

• ## A simple understanding of NMF data typing

Time：2022-4-11

Brief introduction: NMF, short for non-negative matrix factorization, can be understood as another dimensionality-reduction clustering method, often used for single-cell clustering, cancer RNA-seq data clustering, and similar applications. The figure below is a schematic of the NMF decomposition, where the matrix V is an n × m matrix, […]
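
A minimal sketch of the V ≈ W·H factorization described above, using the classic Lee–Seung multiplicative updates on synthetic data (the sizes and iteration count are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((6, 5))        # non-negative n x m data matrix
k = 2                         # number of factors (e.g. clusters)
W = rng.random((6, k))        # basis matrix
H = rng.random((k, 5))        # coefficient matrix
eps = 1e-9                    # guard against division by zero

# Lee-Seung multiplicative updates keep W and H non-negative by construction
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

assert W.min() >= 0 and H.min() >= 0
assert np.linalg.norm(V - W @ H) < np.linalg.norm(V)   # fit improved over W @ H = 0
```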

• ## SVD dimension reduction by singular value decomposition

Time：2022-4-1

Dimensionality reduction is an unsupervised learning method. Singular value decomposition (SVD) reduces dimensionality by decomposing an arbitrary n × d matrix A, via matrix operations, into the product of a left singular matrix U, a singular value matrix Σ, and a right singular matrix V; the dimensionality reduction of matrix A then […]
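
The rank-k truncation this article describes, sketched in NumPy (the matrix here is rank-2 by construction, so the rank-2 approximation recovers it exactly):

```python
import numpy as np

A = np.arange(12, dtype=float).reshape(4, 3)   # a 4x3 matrix of rank 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2
A_k = (U[:, :k] * s[:k]) @ Vt[:k]              # keep only the k largest singular values

assert np.allclose(A, A_k)                     # exact here because rank(A) = 2
```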

• ## Course card | knowledge sorting training camp ⑦: understanding note types from the perspective of knowledge granularity

Time：2022-2-25

Hello~ I'm Yuying, the note-taker who records notes with [chart cards]. Welcome to my course note "Understanding note types from the perspective of knowledge granularity". 1. Rethinking notes. 1. The information-transmission dimension: from speaker to listener, knowledge is reduced in dimension; from listener to speaker, knowledge is raised in dimension. 2. […]

• ## [MindSpore: Let's learn machine learning with Little Mi!] How to achieve dimensionality reduction?

Time：2022-2-17

It's been a whole week, and Little Mi has missed you! Today Little Mi walks you through dimensionality reduction, the second type of unsupervised learning problem we encounter. Without further ado, let's start~ 1. A dimensionality reduction example. First, what is dimensionality reduction? That question needs to be answered first. Since data […]

• ## Feature selection, normalization (SelectKBest, random forest), PCA dimensionality reduction

Time：2022-1-31

Feature selection, normalization (SelectKBest, random forest), PCA dimensionality reduction: SelectKBest with the chi-square test, the random forest algorithm, dimensionality reduction, and normalization. **(1) Read the data and split it into features and label values:** from sklearn.feature_selection import SelectKBest; from sklearn.feature_selection import chi2; import pandas as pd; content = pd.read_csv('dynamic.csv'); x = content.iloc[:, 0:-1]  # x is […]
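
For a self-contained illustration, here is a NumPy sketch of the chi-square score that `SelectKBest(chi2)` ranks features by (toy data; this mirrors the scoring idea, not the sklearn implementation itself):

```python
import numpy as np

# Chi-square scoring expects non-negative features; higher scores mean
# the feature's magnitude depends more strongly on the class label.
X = np.array([[9.0, 1.0],
              [8.0, 1.0],
              [1.0, 1.0],
              [2.0, 1.0]])        # feature 0 varies with the class, feature 1 is constant
y = np.array([0, 0, 1, 1])

classes = np.unique(y)
observed = np.array([X[y == c].sum(axis=0) for c in classes])   # per-class feature sums
class_prob = np.array([(y == c).mean() for c in classes])
expected = class_prob[:, None] * X.sum(axis=0)[None, :]
chi2_scores = ((observed - expected) ** 2 / expected).sum(axis=0)

assert chi2_scores[0] > chi2_scores[1]   # the class-dependent feature scores higher
```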

• ## Rust: join and concat

Time：2021-11-5

Rust's higher-order functions are used frequently, and "flattening" dimensionality reduction often comes up; this is where join and concat shine, and they may often be needed. 1. See the official standard library documentation: concat: fn concat(&self) -> Self::Output, which flattens a slice of T into a single value Self::Output. Example: assert_eq!(["hello", "world"].concat(), "helloworld"); […]

• ## The front end should also understand machine learning (Part I)

Time：2021-8-21

Follow the official account "kite holders" and reply "data" to get 500 GB of material (useful for every role) plus a professional discussion group that's waiting for you (haha). Background: In recent years machine learning has kept growing in popularity, and the front-end field is getting involved as well. From the perspective of large […]

• ## tecdat extension data: R language PCA (principal component analysis) and CA (correspondence analysis) of husband-and-wife occupational differences, with mosaic visualization

Time：2021-8-16

Original link: http://tecdat.cn/?p=22762. Principal component analysis is a commonly used dimensionality-reduction algorithm in data mining. It is a multivariate statistical method proposed by Pearson in 1901 and later developed by Hotelling in 1933. Its main purpose is "dimensionality reduction": by extracting the principal components that exhibit the largest individual differences, it can also be used to […]

• ## Visualizing word embeddings based on PCA and t-SNE

Time：2021-8-13

By Marcellus Ruben, compiled by VK. Source: Towards Data Science. When you hear the words "tea" and "coffee", what do you think of? You might say they are both drinks with a certain amount of caffeine. The key is that we can easily recognize that the two words are related. However, when we provide the words "tea" […]
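
The "tea"/"coffee" relatedness the author describes is what distances between embedding vectors capture; a toy sketch with made-up 3-d vectors (not real word embeddings):

```python
import numpy as np

# Hypothetical toy embeddings, chosen so that tea and coffee point the same way
emb = {
    "tea":    np.array([0.9, 0.8, 0.1]),
    "coffee": np.array([0.85, 0.75, 0.2]),
    "car":    np.array([0.1, 0.2, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal ones."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words sit closer together in the embedding space
assert cosine(emb["tea"], emb["coffee"]) > cosine(emb["tea"], emb["car"])
```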