Largest Sub-Square in a Binary Array

Based on the “largest rectangle in a histogram” problem (shown below), the goal is to analyze a binary 2D array of ones and zeroes and find the largest square section of the array that contains only ones. This is useful for a number of tasks in statistics and image processing.

(Figure: Largest rectangle in a histogram. © GeeksForGeeks.com – shown under a FAIR USE exemption to the DMCA for works relating to public knowledge.)

Using a “waterfall” incrementing operation, we arrive at code along the lines of the sketch below.
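A minimal sketch of one such waterfall pass in plain Python (the function name largest_sub_square and the list-of-lists grid representation are assumptions): each cell containing a one is set to one plus the minimum of its top, left, and top-left neighbours, so the side lengths of completed squares cascade down and to the right.

def largest_sub_square(grid):
    # Returns the side length of the largest all-ones square in a binary grid.
    if not grid or not grid[0]:
        return 0
    rows, cols = len(grid), len(grid[0])
    # counts[r][c] is the side of the largest all-ones square whose
    # bottom-right corner sits at (r, c).
    counts = [[0] * cols for _ in range(rows)]
    best = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                if r == 0 or c == 0:
                    counts[r][c] = 1
                else:
                    # "Waterfall" step: extend the smallest of the three
                    # neighbouring squares by one cell.
                    counts[r][c] = 1 + min(counts[r - 1][c],
                                           counts[r][c - 1],
                                           counts[r - 1][c - 1])
                best = max(best, counts[r][c])
    return best

The pass touches each cell exactly once, so it runs in time proportional to the number of cells in the array.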


The best part is that by manipulating the “bias” variable, we can control the average ratio of ones to zeros in the array, allowing us to explore a number of different potential outcomes. With an “even” bias (i.e., 50%), the largest sub-square is often quite small.
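As a sketch of how such an array might be generated (the helper name random_grid and the use of Python's random module are assumptions; only the “bias” variable is described in the original):

import random

def random_grid(rows, cols, bias=0.5):
    # bias is the probability that any given cell holds a one:
    # 0.5 gives an even mix, while 0.8 skews heavily toward ones.
    return [[1 if random.random() < bias else 0 for _ in range(cols)]
            for _ in range(rows)]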


With a modified bias (80% in the case below), by contrast, we get much larger sub-squares.
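Putting the two sketches together (output varies from run to run, since the grid is random):

grid = random_grid(20, 20, bias=0.8)
print(largest_sub_square(grid))

Raising the bias toward 1.0 makes long runs of adjacent ones far more likely, which is why the largest sub-squares grow along with it.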