color_tools.image.analysis

Image analysis functions for color extraction and manipulation.

This module provides functions for extracting colors from images and redistributing their luminance values for perceptual uniformity.

Requires Pillow (PIL) - install with: pip install -r requirements-image.txt

class color_tools.image.analysis.ColorCluster(centroid_rgb, centroid_lab, pixel_indices, pixel_count)[source]

Bases: object

A cluster of similar colors from k-means clustering in LAB color space.

Represents a group of perceptually similar pixels extracted from an image. The centroid is the representative color for the cluster, and pixel assignments enable remapping the original image to use only the dominant colors.

Variables:
  • centroid_rgb – Representative RGB color for this cluster (0-255 each)

  • centroid_lab – Representative color in CIE LAB space (L: 0-100, a/b: ~-128 to +127)

  • pixel_indices – List of pixel indices (flat array positions) belonging to this cluster

  • pixel_count – Number of pixels in this cluster (dominance weight)

Example

>>> from color_tools.image import extract_color_clusters
>>> clusters = extract_color_clusters("photo.jpg", n_colors=5)
>>> for i, cluster in enumerate(clusters, 1):
...     print(f"Color {i}: RGB{cluster.centroid_rgb} ({cluster.pixel_count} pixels)")
Color 1: RGB(45, 52, 71) (15234 pixels)
Color 2: RGB(189, 147, 128) (8921 pixels)

centroid_rgb: Tuple[int, int, int]
centroid_lab: Tuple[float, float, float]
pixel_indices: List[int]
pixel_count: int
__str__()[source]

Human-readable cluster representation: RGB color with pixel count.

Return type:

str
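
As noted above, the pixel assignments stored on each cluster enable remapping the original image to its dominant colors. A minimal sketch of that remapping (the function name `remap_to_centroids` is illustrative, not part of this module's API; it only assumes objects with the `centroid_rgb` and `pixel_indices` attributes documented here):

```python
import numpy as np

def remap_to_centroids(flat_pixels, clusters):
    """Replace every pixel with its cluster's centroid RGB color.

    flat_pixels: (N, 3) uint8 array of RGB pixels (flattened image).
    clusters: objects with .centroid_rgb and .pixel_indices, as
        returned by extract_color_clusters().
    """
    out = np.empty_like(flat_pixels)
    for cluster in clusters:
        # Assign the representative color to every pixel in the cluster.
        out[cluster.pixel_indices] = cluster.centroid_rgb
    return out
```

Reshaping the result back to the image's original (height, width, 3) shape yields the remapped image.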

color_tools.image.analysis.l_value_to_hueforge_layer(l_value, total_layers=27)[source]

Convert an LCH L value (0-100) to a HueForge layer number.

Parameters:
  • l_value (float) – LCH L value (0-100), as returned by rgb_to_lch(). 0 = black, 100 = white. This is perceptual lightness, not HSL lightness.

  • total_layers (int) – Total layers in HueForge (default: 27)

Return type:

int

Returns:

Layer number (1-based, from 1 to total_layers)

Example

>>> l_value_to_hueforge_layer(0.0)    # Darkest
1
>>> l_value_to_hueforge_layer(33.3)   # 1/3 up
10
>>> l_value_to_hueforge_layer(100.0)  # Brightest
27
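
The mapping is a linear scale from L onto the layer range. A plausible sketch consistent with the doctest values above (the actual implementation may differ in rounding details):

```python
def l_to_layer(l_value: float, total_layers: int = 27) -> int:
    """Map perceptual lightness L (0-100) onto a 1-based layer index.

    L=0 maps to layer 1 (darkest); L=100 maps to layer total_layers
    (brightest).
    """
    # Scale L onto 0..(total_layers - 1), then shift to 1-based.
    return round(l_value / 100 * (total_layers - 1)) + 1
```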
class color_tools.image.analysis.ColorChange(original_rgb, original_lch, new_rgb, new_lch, delta_e, hueforge_layer)[source]

Bases: object

Represents a color transformation from luminance redistribution for HueForge optimization.

Tracks the before/after state when redistributing luminance values evenly across a set of colors. This is used for HueForge 3D printing to spread colors across the 27 available layers and prevent multiple colors from bunching on the same layer.

Variables:
  • original_rgb – Original RGB color before redistribution (0-255 each)

  • original_lch – Original LCH color (L: 0-100, C: 0-100+, H: 0-360°)

  • new_rgb – New RGB color after luminance redistribution (0-255 each)

  • new_lch – New LCH color with redistributed L value (L: 0-100, C: 0-100+, H: 0-360°)

  • delta_e – Perceptual color difference (Delta E 2000) between original and new

  • hueforge_layer – Target HueForge layer number (1-27) based on new L value

Example

>>> from color_tools.image import extract_color_clusters, redistribute_luminance
>>> clusters = extract_color_clusters("image.jpg", n_colors=10)
>>> colors = [c.centroid_rgb for c in clusters]
>>> changes = redistribute_luminance(colors)
>>> for change in changes:
...     print(f"Layer {change.hueforge_layer}: RGB{change.new_rgb} (ΔE: {change.delta_e:.1f})")
Layer 3: RGB(45, 52, 71) (ΔE: 12.3)
Layer 7: RGB(89, 95, 102) (ΔE: 18.7)

original_rgb: Tuple[int, int, int]
original_lch: Tuple[float, float, float]
new_rgb: Tuple[int, int, int]
new_lch: Tuple[float, float, float]
delta_e: float
hueforge_layer: int
__str__()[source]

Human-readable color change: layer assignment with delta E.

Return type:

str

color_tools.image.analysis.extract_color_clusters(image_path, n_colors=10, use_lab_distance=True, *, distance_metric='lab', l_weight=1.0, use_l_median=False, n_iter=10)[source]

Extract color clusters from an image using k-means clustering.

This uses k-means in LAB color space for perceptually uniform clustering. Returns full cluster data including pixel assignments for later remapping.

Parameters:
  • image_path (str) – Path to the image file.

  • n_colors (int) – Number of clusters to extract (default: 10).

  • use_lab_distance (bool) – Deprecated — use distance_metric instead. When True (default) and distance_metric is not set, uses LAB Euclidean distance. When False, uses raw RGB distance.

  • distance_metric (str) –

    Distance metric for cluster assignment.

    • "lab" — squared Euclidean in CIE LAB space (default)

    • "rgb" — squared Euclidean in sRGB space

    • "hyab" — HyAB (hybrid L + chromatic) in LAB space; recommended with l_weight=2.0 for image quantization

  • l_weight (float) – Lightness weight for HyAB distance (default: 1.0). A value of 2.0 (quantize_image_hyab default) emphasises lightness differences, yielding better separation of shades. Ignored unless distance_metric="hyab".

  • use_l_median (bool) – When True, use the median of the L channel (and mean of a/b) when updating centroids. This makes dark and light perceptual groups more stable. Ignored when distance_metric="rgb".

  • n_iter (int) – Number of k-means iterations (default: 10).

Return type:

List[ColorCluster]

Returns:

List of ColorCluster objects with centroids and pixel assignments, sorted by pixel_count descending.

Example:

>>> clusters = extract_color_clusters("photo.jpg", n_colors=8)
>>> for cluster in clusters:
...     print(f"Color: {cluster.centroid_rgb}, Pixels: {cluster.pixel_count}")
Color: (255, 0, 0), Pixels: 1523
Color: (0, 128, 255), Pixels: 892

HyAB k-means example:

>>> clusters = extract_color_clusters(
...     "photo.jpg",
...     n_colors=16,
...     distance_metric="hyab",
...     l_weight=2.0,
...     use_l_median=True,
... )
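
The HyAB metric used above combines an absolute (city-block) lightness term with Euclidean distance in the chromatic plane. A minimal sketch, assuming CIE LAB inputs (the function name `hyab_distance` is illustrative, not this module's API):

```python
import math

def hyab_distance(lab1, lab2, l_weight=1.0):
    """HyAB distance between two CIE LAB colors.

    Hybrid metric: weighted absolute lightness difference plus
    Euclidean distance in the (a, b) chroma plane.
    """
    dl = abs(lab1[0] - lab2[0])
    dab = math.hypot(lab1[1] - lab2[1], lab1[2] - lab2[2])
    return l_weight * dl + dab
```

With l_weight=2.0, a pure lightness difference counts double relative to the same chromatic difference, which is why that setting separates shades more aggressively.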
color_tools.image.analysis.extract_unique_colors(image_path, n_colors=10)[source]

Extract unique colors from an image using k-means clustering.

This is a simplified wrapper around extract_color_clusters that just returns the centroid RGB values for backward compatibility.

Parameters:
  • image_path (str) – Path to the image file

  • n_colors (int) – Number of unique colors to extract (default: 10)

Return type:

List[Tuple[int, int, int]]

Returns:

List of RGB tuples (0-255 for each component)

Example

>>> colors = extract_unique_colors("photo.jpg", n_colors=8)
>>> print(colors)
[(255, 0, 0), (0, 128, 255), ...]
color_tools.image.analysis.quantize_image_hyab(image_path, n_colors=16, *, n_iter=10, l_weight=2.0, use_l_median=True)[source]

Quantize an image to n_colors using HyAB k-means clustering.

HyAB uses hybrid L + chromatic distance in CIE LAB space, which often produces better separation of light and dark tones than pure Euclidean LAB distance. The default l_weight=2.0 is the value recommended by Abasi et al. (2020) for image quantization tasks.

Steps:

  1. Run k-means with HyAB distance to find cluster centroids.

  2. Map every pixel to its nearest centroid colour.

  3. Return the quantized image as a PIL.Image.Image.
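
Step 2 above (mapping each pixel to its nearest centroid) can be sketched with NumPy broadcasting, shown here with plain squared Euclidean LAB distance for brevity (the function itself uses HyAB; `nearest_centroid_labels` is an illustrative name, not this module's API):

```python
import numpy as np

def nearest_centroid_labels(pixels_lab, centroids_lab):
    """Return, for each pixel, the index of its nearest centroid.

    pixels_lab: (N, 3) array of LAB pixels.
    centroids_lab: (K, 3) array of LAB centroids.
    """
    # Broadcast to an (N, K, 3) difference tensor, then reduce to an
    # (N, K) matrix of squared distances.
    diffs = pixels_lab[:, None, :] - centroids_lab[None, :, :]
    d2 = (diffs ** 2).sum(axis=2)
    return d2.argmin(axis=1)
```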

Parameters:
  • image_path (str) – Path to the input image file.

  • n_colors (int) – Palette size — number of distinct colours in the output (default: 16).

  • n_iter (int) – Number of k-means iterations (default: 10).

  • l_weight (float) – Lightness weight for HyAB distance (default: 2.0). Higher values emphasise lightness differences.

  • use_l_median (bool) – Use the median (not mean) of the L channel when updating centroids (default: True). Improves stability of dark/light clusters.

Return type:

Image

Returns:

Quantized PIL.Image.Image in RGB mode.

Example:

>>> img = quantize_image_hyab("photo.jpg", n_colors=8)
>>> img.save("quantized.png")

See also

extract_color_clusters() — lower-level function that returns cluster data instead of a rendered image.

color_tools.image.analysis.redistribute_luminance(colors)[source]

Redistribute LCH lightness values evenly across a list of colors.

This function:

  1. Converts colors to LCH space

  2. Sorts by LCH L (lightness) value

  3. Redistributes L values evenly between 0 and 100

  4. Converts back to RGB

  5. Calculates Delta E for each change

Parameters:

colors (List[Tuple[int, int, int]]) – List of RGB tuples to redistribute

Return type:

List[ColorChange]

Returns:

List of ColorChange objects showing before/after for each color

Example

>>> colors = [(100, 50, 30), (200, 180, 160), (50, 50, 50)]
>>> changes = redistribute_luminance(colors)
>>> for change in changes:
...     print(f"L: {change.original_lch[0]:.1f} -> {change.new_lch[0]:.1f}, ΔE={change.delta_e:.2f}")
L: 24.3 -> 0.0, ΔE=12.45
L: 53.2 -> 50.0, ΔE=3.21
L: 76.8 -> 100.0, ΔE=23.14
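
The even spacing of L values can be sketched on bare lightness numbers, leaving out the color-space conversions and Delta E steps (`redistribute_l` is an illustrative helper, not this module's API):

```python
def redistribute_l(l_values):
    """Spread lightness values evenly over [0, 100], preserving order.

    Returns the new L value for each input, in the original input order.
    """
    n = len(l_values)
    if n == 1:
        return [50.0]
    # Rank each color by lightness, then place ranks evenly on 0..100.
    order = sorted(range(n), key=lambda i: l_values[i])
    new_l = [0.0] * n
    for rank, i in enumerate(order):
        new_l[i] = 100.0 * rank / (n - 1)
    return new_l
```

For the three L values in the example above (24.3, 53.2, 76.8), this yields 0.0, 50.0, and 100.0, matching the doctest output.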
color_tools.image.analysis.format_color_change_report(changes)[source]

Format a human-readable report of color changes.

Parameters:

changes (List[ColorChange]) – List of ColorChange objects

Return type:

str

Returns:

Formatted string showing before/after for each color

Example

>>> changes = redistribute_luminance([(100, 50, 30), (200, 180, 160)])
>>> print(format_color_change_report(changes))
Color Luminance Redistribution Report
=====================================
  1. RGB(100, 50, 30) → RGB(98, 48, 28)
     L: 24.3 → 33.3 | C: 28.5 → 28.5 | H: 31.2 → 31.2
     ΔE (CIEDE2000): 9.12