Friday, August 29, 2025

Machine Learning: Simple Image Compression using Singular Value Decomposition (SVD)


Singular Value Decomposition (SVD) is a matrix factorization technique commonly used in machine learning for dimensionality reduction, data compression, and feature extraction. It decomposes a matrix into three simpler matrices, revealing underlying patterns in the data. 
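For reference, the standard result behind this (a brief statement, not text from the original post): an m × n matrix A factors as A = U Σ Vᵀ, with the singular values on the diagonal of Σ in decreasing order. Keeping only the k largest singular values gives the rank-k approximation A_k = U_k Σ_k V_kᵀ, which is what the code below builds for each colour channel.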


As seen above, the original picture on the left is compressed into the picture on the right. From the screenshot below, the file size of 'compressed_dog' is just 59 KB, compared to 173 KB for 'dog'.
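The on-disk saving above comes from JPEG-encoding the smoother rank-k image; the factorization itself gives the more fundamental reduction. A rough back-of-envelope sketch (the dimensions here are assumed for illustration only; the post does not state the actual image size):

# Rough storage comparison for one m x n colour channel; m and n are assumed
# example dimensions, not the actual size of dog.jpg.
m, n, k = 600, 800, 20
full_values = m * n                # numbers stored for the full channel
svd_values = k * (m + n + 1)       # numbers stored for U_k, sigma_k and V_k
print(f"{full_values} vs {svd_values} values per channel "
      f"({svd_values / full_values:.1%} of the original)")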







Python Code by Grok

import numpy as np
from PIL import Image
import matplotlib.pyplot as plt

# Step 1: Load the color image from the specified directory
image_path = "C:/Users/Ee Leen/dog.jpg"  # Image path
image = Image.open(image_path)  # Load color image
image_array = np.array(image)  # Shape: (height, width, 3)

# Step 2: Separate RGB channels
r = image_array[:, :, 0]  # Red channel
g = image_array[:, :, 1]  # Green channel
b = image_array[:, :, 2]  # Blue channel

# Step 3: Apply SVD to each channel
k = 20  # Number of singular values to keep per channel
Ur, sigmar, Vtr = np.linalg.svd(r, full_matrices=False)
Ug, sigmag, Vtg = np.linalg.svd(g, full_matrices=False)
Ub, sigmab, Vtb = np.linalg.svd(b, full_matrices=False)

# Step 4: Reduce dimensions for each channel
Ur_k = Ur[:, :k]
sigmar_k = np.diag(sigmar[:k])
Vtr_k = Vtr[:k, :]

Ug_k = Ug[:, :k]
sigmag_k = np.diag(sigmag[:k])
Vtg_k = Vtg[:k, :]

Ub_k = Ub[:, :k]
sigmab_k = np.diag(sigmab[:k])
Vtb_k = Vtb[:k, :]

# Step 5: Reconstruct each channel
r_compressed = np.dot(Ur_k, np.dot(sigmar_k, Vtr_k))
g_compressed = np.dot(Ug_k, np.dot(sigmag_k, Vtg_k))
b_compressed = np.dot(Ub_k, np.dot(sigmab_k, Vtb_k))

# Step 6: Combine channels into a single image
compressed_image = np.stack([r_compressed, g_compressed, b_compressed], axis=2)
compressed_image = np.clip(compressed_image, 0, 255).astype(np.uint8)

# Step 7: Display original and compressed images
plt.figure(figsize=(10, 5))
plt.subplot(1, 2, 1)
plt.title("Original Image")
plt.imshow(image_array)
plt.axis("off")

plt.subplot(1, 2, 2)
plt.title(f"Compressed Image (k={k})")
plt.imshow(compressed_image)
plt.axis("off")
plt.show()

# Step 8: Save compressed image
output_path = "C:/Users/Ee Leen/compressed_dog.jpg"  # Output path
Image.fromarray(compressed_image).save(output_path)






Wednesday, August 27, 2025

Just Own Research: Classification of Data using K-Nearest Neighbor Algorithm ~ Machine Learning

 

As shown in the graph above, this test finds out whether the test point is nearer to the Normal or the Anomalous condition. Anomalous readings could trigger alerts or maintenance actions in a factory setting.

Explanation
  1. Synthetic Training Data:
    • Normal Data: 50 points centered at (30°C, 0.5°C/min), representing typical equipment operation (e.g., a motor running normally in a factory).
    • Anomalous Data: 50 points centered at (60°C, 1.0°C/min), indicating overheating or equipment stress.
    • Realism: These values are based on typical industrial temperature sensor ranges (e.g., thermocouples monitoring motors or HVAC systems), where normal temperatures are 20–40°C and anomalous ones exceed 50°C.
  2. Realistic Test Point:
    • Test Point: [42, 0.9] (42°C, 0.9°C/min).
    • Why Realistic:
      • 42°C is just above normal operating temperatures (e.g., a motor typically at 30°C), suggesting early signs of overheating but not yet critical (e.g., >50°C).
      • 0.9°C/min is a moderate rate of change, plausible for equipment under stress (e.g., a cooling system failing), making it an edge case that tests k-NN’s ability to distinguish between normal and anomalous conditions.
      • These values align with real-world factory scenarios where temperature sensors detect gradual failures.
  3. k-NN Classification:
    • Uses Euclidean distance to find the 3 nearest neighbors to the test point in the 2D space (temperature, rate of change).
    • Classifies the test point as “Normal” or “Anomalous” based on majority voting.
    • The test point [42, 0.9] is likely classified as “Normal” but is close to the anomalous cluster, making it a realistic and interesting case.

Python Code by Grok

import numpy as np
import matplotlib.pyplot as plt
from collections import Counter

# Generate synthetic temperature sensor data
np.random.seed(42)
# Normal data: 50 points (temperature in °C, rate of change in °C/min)
normal_data = np.random.normal(loc=[30, 0.5], scale=[5, 0.1], size=(50, 2))  # Centered at (30°C, 0.5°C/min)
# Anomalous data: 50 points (high temperature, higher rate of change)
anomalous_data = np.random.normal(loc=[60, 1.0], scale=[5, 0.1], size=(50, 2))  # Centered at (60°C, 1.0°C/min)
# Combine data and labels
X = np.vstack((normal_data, anomalous_data))
y = np.array([0] * 50 + [1] * 50)  # 0 = Normal, 1 = Anomalous

# Realistic test sensor reading (temperature, rate of change)
test_point = np.array([42, 0.9])  # Realistic: slightly high temp, moderate rate of change

# Euclidean distance function
def euclidean_distance(point1, point2):
    return np.sqrt(np.sum((point1 - point2) ** 2))

# k-NN classifier
def knn_predict(X_train, y_train, test_point, k=3):
    distances = [(euclidean_distance(test_point, x), y) for x, y in zip(X_train, y_train)]
    distances.sort()
    k_nearest_labels = [label for _, label in distances[:k]]
    return Counter(k_nearest_labels).most_common(1)[0][0]

# Classify the test point
k = 3
predicted_class = knn_predict(X, y, test_point, k)
class_name = "Normal" if predicted_class == 0 else "Anomalous"
print(f"Test temperature reading {test_point} is classified as: {class_name}")

# Plot 1: Temperature sensor data before classification
plt.figure(figsize=(8, 6))
plt.scatter(normal_data[:, 0], normal_data[:, 1], c='blue', label='Normal', alpha=0.6)
plt.scatter(anomalous_data[:, 0], anomalous_data[:, 1], c='red', label='Anomalous', alpha=0.6)
plt.xlabel('Temperature (°C)')
plt.ylabel('Rate of Change (°C/min)')
plt.title('Temperature Sensor Data Before Classification')
plt.legend()
plt.grid(True)
plt.show()

# Plot 2: Temperature sensor data with classification result
plt.figure(figsize=(8, 6))
plt.scatter(normal_data[:, 0], normal_data[:, 1], c='blue', label='Normal', alpha=0.6)
plt.scatter(anomalous_data[:, 0], anomalous_data[:, 1], c='red', label='Anomalous', alpha=0.6)
plt.scatter(test_point[0], test_point[1], c='green', marker='*', s=200, label=f'Test Point ({class_name})')
plt.xlabel('Temperature (°C)')
plt.ylabel('Rate of Change (°C/min)')
plt.title('Temperature Sensor Data Classification with k-NN (k=3)')
plt.legend()
plt.grid(True)
plt.show()
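As an optional cross-check (a sketch that assumes scikit-learn is installed; it is not part of the code above), the same prediction can be made with scikit-learn's KNeighborsClassifier. Because temperature and rate of change sit on very different scales, standardising the features first can change which neighbours dominate the Euclidean distance:

# Optional cross-check with scikit-learn; reuses X, y and test_point defined above.
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

scaler = StandardScaler().fit(X)                  # put both features on comparable scales
knn = KNeighborsClassifier(n_neighbors=3).fit(scaler.transform(X), y)
pred = knn.predict(scaler.transform(test_point.reshape(1, -1)))[0]
print("scikit-learn prediction:", "Normal" if pred == 0 else "Anomalous")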

Wednesday, August 20, 2025

Just Research: Trajectory Planning Of A Factory Robot


Trajectory planning is a critical aspect of robotics, particularly in the realm of industrial robots. It involves determining the path that a robot's end effector will take to move from one point to another while considering various constraints.
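As a small illustration of the idea (a sketch only; the joint angles and duration below are assumed values, not taken from the referenced article), a common building block is a cubic polynomial joint trajectory with zero start and end velocities:

# A sketch of a cubic polynomial joint trajectory with zero start/end velocities
# (joint angles and duration are assumed values, not taken from the referenced article).
import numpy as np
import matplotlib.pyplot as plt

q0, qf = 0.0, np.pi / 2          # assumed start and end joint angles (rad)
T = 2.0                          # assumed move duration (s)
t = np.linspace(0, T, 200)
s = t / T

q = q0 + (qf - q0) * (3 * s**2 - 2 * s**3)      # satisfies q(0)=q0, q(T)=qf, q'(0)=q'(T)=0
qd = (qf - q0) * (6 * s - 6 * s**2) / T         # joint velocity

plt.plot(t, q, label="Position (rad)")
plt.plot(t, qd, label="Velocity (rad/s)")
plt.xlabel("Time (s)")
plt.title("Cubic joint trajectory (zero boundary velocities)")
plt.legend()
plt.grid(True)
plt.show()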

References: 

  1. https://eureka.patsnap.com/article/what-is-trajectory-planning-in-industrial-robots
  2. Graph plotted using Python code generated by Grok.

Tuesday, August 19, 2025

Control Chart For Quality Control Procedures

Words to Grok

Can you generate Python code for the below? The code constructs a control chart for sample means to help monitor the process.

Tan plc in its quality control procedures sets the warning limit at the 0.025 and the action limit at the 0.001 probability point. This means, for example, that the upper action line is set so that the probability of the mean exceeding the line is 0.001. The company produces steel ingots with a mean weight of 6 kg and a known standard deviation of 800 grams. The output is monitored by taking random samples of four ingots at regular intervals during the production process.
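A minimal sketch of how those limits could be computed and drawn (not the code Grok actually produced for the post): the standard error of a sample mean of four ingots is 0.8/√4 = 0.4 kg, so the 0.025 warning limits sit at 6 ± 1.96 × 0.4 ≈ 6 ± 0.78 kg and the 0.001 action limits at 6 ± 3.09 × 0.4 ≈ 6 ± 1.24 kg.

# Sketch of the control chart limits for sample means (not the post's actual code).
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

mu, sigma, n = 6.0, 0.8, 4            # mean 6 kg, sd 800 g = 0.8 kg, samples of 4 ingots
se = sigma / np.sqrt(n)               # standard error of the sample mean = 0.4 kg

z_warn = norm.ppf(1 - 0.025)          # about 1.96
z_action = norm.ppf(1 - 0.001)        # about 3.09

warn_lo, warn_hi = mu - z_warn * se, mu + z_warn * se          # about 5.22 kg and 6.78 kg
action_lo, action_hi = mu - z_action * se, mu + z_action * se  # about 4.76 kg and 7.24 kg

# Plot some illustrative sample means against the limits
np.random.seed(0)
sample_means = np.random.normal(mu, se, 20)
plt.plot(sample_means, "o-", label="Sample means")
plt.axhline(mu, color="black", label="Target mean")
plt.axhline(warn_lo, color="orange", linestyle="--", label="Warning limits")
plt.axhline(warn_hi, color="orange", linestyle="--")
plt.axhline(action_lo, color="red", linestyle=":", label="Action limits")
plt.axhline(action_hi, color="red", linestyle=":")
plt.xlabel("Sample number")
plt.ylabel("Mean ingot weight (kg)")
plt.title("Control chart for sample means")
plt.legend()
plt.grid(True)
plt.show()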



Sunday, August 17, 2025

Example of Simple (Pretrained AI) Object Detection On Photo

I typed the below into Grok.

Can you generate Python code to load the photo dog.jpg from my computer's C drive, then circle the dog in the picture?


 
Figure 1: The dog is detected and circled.

As shown above, the dog is detected and circled. Object detection is one of the applications of AI.

The pretrained AI model YOLO is used in this run. YOLO is built on deep learning and computer vision, and it performs detection in a single forward pass through a convolutional neural network (CNN).

A convolutional neural network (CNN) is a type of feedforward neural network that learns features via filter (or kernel) optimization. This type of deep learning network has been applied to process and make predictions from many different types of data including text, images and audio.
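For readers who want to try something similar, a minimal sketch is below. It assumes the ultralytics package and its pretrained YOLOv8 weights, with an assumed image path; it is not the exact code Grok generated for this post.

# A sketch using the ultralytics YOLOv8 API (assumed; not the exact code Grok generated).
from ultralytics import YOLO
from PIL import Image, ImageDraw

image_path = "dog.jpg"                      # assumed path - adjust to where dog.jpg actually lives
model = YOLO("yolov8n.pt")                  # small pretrained model; weights download on first use
results = model(image_path)                 # run detection

image = Image.open(image_path).convert("RGB")
draw = ImageDraw.Draw(image)
for box, cls in zip(results[0].boxes.xyxy.tolist(), results[0].boxes.cls.tolist()):
    if model.names[int(cls)] == "dog":      # keep only detections labelled "dog"
        x1, y1, x2, y2 = box
        draw.ellipse([x1, y1, x2, y2], outline="red", width=5)   # circle the dog
image.save("dog_circled.jpg")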


References: 

1. https://en.wikipedia.org/wiki/Convolutional_neural_network

Saturday, August 16, 2025

Example of Simple AI (Artificial Neural Network) Application On ECG Data

I think AI can be applied in signal processing to process data such as ECG and gait data.

The words I typed into Grok to generate the graphs are as below:

Can you generate Python code to draw a graph of raw ECG data and another graph where an artificial neural network is performed on the raw data to recognize the real pattern of the ECG data, and label the graphs

Explanation from Grok
I'll provide Python code to visualize raw ECG data and apply a simple artificial neural network (ANN) to process the data, attempting to recognize patterns (e.g., denoising or feature enhancement). The ANN will be basic, as ECG pattern recognition typically requires domain-specific preprocessing and labeled data for tasks like QRS detection or arrhythmia classification. Since no specific ECG dataset is provided, I'll generate synthetic ECG-like data for demonstration. The code will include two labeled plots: one for raw ECG data and one for the ANN-processed output.

Assumptions
  • We'll use numpy to generate synthetic ECG data mimicking a periodic signal with P, QRS, and T waves.
  • The ANN will be a simple feedforward neural network (using tensorflow/keras) trained to denoise or reconstruct the ECG signal.
  • We'll add noise to the raw data to simulate real-world conditions and train the ANN to recover the clean signal.
  • Libraries required: numpy, matplotlib, tensorflow.
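A minimal sketch of the kind of network described in those assumptions is shown below. The synthetic sine-based signal, window size and layer sizes are all assumptions for illustration; this is not the exact code Grok produced.

# A sketch of a small denoising ANN on synthetic data (signal and layer sizes are assumptions;
# this is not the exact code Grok produced for the post).
import numpy as np
import tensorflow as tf

# Synthetic "ECG-like" signal: a periodic waveform plus noise, cut into fixed-length windows
t = np.linspace(0, 10, 2000)
clean = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 5 * t)
noisy = clean + np.random.normal(0, 0.3, size=t.shape)

window = 100
X = noisy.reshape(-1, window)   # noisy windows are the inputs
Y = clean.reshape(-1, window)   # clean windows are the targets

# Small feedforward network that maps each noisy window to its clean version
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(window),          # linear output reconstructs the window
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, Y, epochs=50, verbose=0)

denoised = model.predict(X).reshape(-1)     # flatten back to a single signal for plotting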



Thursday, August 14, 2025

Just Own Research: Non Linear ODE, Phase Diagram and Steady State Analysis

Many nonlinear differential equations cannot be solved explicitly as functions of time. Phase diagrams, however, offer qualitative information about the stability of equations that is helpful in determining whether the equations will converge to an intertemporal (steady-state) equilibrium or not.

If the arrows of motion point towards a steady-state solution, the solution is stable; if the arrows of motion point away from a steady-state solution, the solution is unstable.

The stability of the steady-state equilibrium points can now be read from the graph. Since the arrows of motion point away from the first intertemporal equilibrium, it is an unstable equilibrium. With the arrows of motion pointing toward the second intertemporal equilibrium, it is a stable equilibrium.
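As an illustration of reading stability off such a diagram (a sketch using a generic equation, dy/dt = y(1 − y), not the textbook's exact example), the plot below shows an unstable steady state at y = 0 and a stable one at y = 1:

# A sketch of a phase diagram for dy/dt = y*(1 - y): unstable steady state at y = 0,
# stable steady state at y = 1 (illustrative; not the textbook's exact equation).
import numpy as np
import matplotlib.pyplot as plt

y = np.linspace(-0.5, 1.5, 400)
dy = y * (1 - y)                          # dy/dt as a function of y

plt.plot(y, dy, label="dy/dt = y(1 - y)")
plt.axhline(0, color="black", linewidth=0.8)
plt.scatter([0, 1], [0, 0], color=["red", "green"], zorder=3)
plt.annotate("unstable (arrows point away)", (0, 0), textcoords="offset points", xytext=(10, 15))
plt.annotate("stable (arrows point toward)", (1, 0), textcoords="offset points", xytext=(-60, 15))
plt.xlabel("y")
plt.ylabel("dy/dt")
plt.title("Phase diagram with steady states at y = 0 and y = 1")
plt.legend()
plt.grid(True)
plt.show()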

References:

Pena-Levano, Luis Moises. Schaum's Outline of Calculus for Business, Economics and Finance, Fourth Edition (p. 368). McGraw Hill LLC. Kindle Edition. 

Python code for graph generated using Grok.

Wednesday, August 13, 2025

Just Hobby: Ordinary Differential Equation of Simple Pressure Sensor Dynamic Control System ~ By GROK

 Ordinary Differential Equation of a simple dynamic control system

The ODE for the simple pressure sensor, with voltage output V(t) and input pressure P(t), is:



This is plotted using Python; the code was generated by Grok.
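The equation and plot in the original post are images. Purely as an illustration (the first-order form tau·dV/dt + V = K·P(t) and all parameter values below are assumptions, not necessarily the post's actual ODE), such a model can be solved and plotted like this:

# Illustration only: a generic first-order sensor ODE  tau*dV/dt + V = K*P(t).
# The form of the equation and all parameter values are assumptions, not the post's actual ODE.
import numpy as np
from scipy.integrate import odeint
import matplotlib.pyplot as plt

tau, K = 0.5, 2.0                           # assumed time constant (s) and gain
P = lambda t: 1.0 if t >= 1.0 else 0.0      # assumed step change in input pressure at t = 1 s

def dVdt(V, t):
    return (K * P(t) - V) / tau             # rearranged first-order ODE

t = np.linspace(0, 5, 500)
V = odeint(dVdt, 0.0, t).flatten()          # integrate from V(0) = 0

plt.plot(t, V, label="Sensor output V(t)")
plt.plot(t, [P(ti) for ti in t], "--", label="Input pressure P(t)")
plt.xlabel("Time (s)")
plt.title("First-order pressure sensor response (illustrative)")
plt.legend()
plt.grid(True)
plt.show()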