Program to count how many ways we can cut a matrix into k pieces in Python


Suppose we have a binary matrix and a value k. We want to split the matrix into k pieces such that each piece contains at least one 1. The cuts must follow these rules, in order: 1. Select a direction: vertical or horizontal. 2. Select an index in the matrix at which to cut the current part into two sections. 3. If we cut vertically, we can no longer cut the left part and can only continue cutting the right part. 4. If we cut horizontally, we can no longer cut the top part and can only continue cutting the bottom part. We have to find the number of different ways to divide the matrix. If the answer is very large, return the result mod (10^9 + 7).

So, if the input is like

1 1 0
1 0 1
1 1 1

and k = 2, then the output will be 4, because there are two valid vertical cuts (after column 0 and after column 1) and two valid horizontal cuts (after row 0 and after row 1), each of which leaves at least one 1 in both pieces.

To solve this, we will follow these steps −

  • p := 10^9 + 7
  • m := row count of matrix, n := column count of matrix
  • counts := an empty map
  • for i in range m - 1 down to 0, do
    • for j in range n - 1 down to 0, do
      • counts[(i, j)] := counts[(i + 1, j)] + counts[(i, j + 1)] - counts[(i + 1, j + 1)] + matrix[i, j] (counts[(i, j)] is the number of 1s in the submatrix whose top-left corner is (i, j); see the sketch after this list)
  • Define a function f(). This will take x, y, c (the top-left corner of the current part and the number of cuts still to make)
  • count := counts[x, y]
  • if c is 0, then
    • return 1 when count > 0 otherwise 0
  • ans := 0
  • for i in range x + 1 to m - 1, do
    • if 0 < counts[(i, y)] < count, then
      • ans := ans + f(i, y, c - 1)
  • for j in range y + 1 to n - 1, do
    • if 0 < counts[(x, j)] < count, then
      • ans := ans + f(x, j, c - 1)
  • return ans mod p
  • From the main method, call and return f(0, 0, k - 1)
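For intuition, counts[(i, j)] holds the number of 1s in the submatrix whose top-left corner is (i, j) and whose bottom-right corner is (m - 1, n - 1). Here is a minimal sketch of just this suffix-sum step, printed for the sample matrix; the helper name build_counts is only illustrative and is not part of the solution below −

from collections import defaultdict

def build_counts(matrix):
   # counts[(i, j)] = number of 1s in the submatrix from (i, j) to (m - 1, n - 1)
   m, n = len(matrix), len(matrix[0])
   counts = defaultdict(int)
   for i in range(m - 1, -1, -1):
      for j in range(n - 1, -1, -1):
         counts[(i, j)] = (counts[(i + 1, j)] + counts[(i, j + 1)] - counts[(i + 1, j + 1)] + matrix[i][j])
   return counts

matrix = [
   [1, 1, 0],
   [1, 0, 1],
   [1, 1, 1],
]
counts = build_counts(matrix)
for i in range(3):
   print([counts[(i, j)] for j in range(3)])
# prints:
# [7, 4, 2]
# [5, 3, 2]
# [3, 2, 1]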

Let us see the following implementation to get a better understanding −

Example 


from collections import defaultdict

class Solution:
   def solve(self, matrix, k):
      p = 10 ** 9 + 7

      m, n = len(matrix), len(matrix[0])
      # counts[(i, j)] = number of 1s in the submatrix whose top-left
      # corner is (i, j), built as a 2D suffix sum
      counts = defaultdict(int)
      for i in range(m - 1, -1, -1):
         for j in range(n - 1, -1, -1):
            counts[(i, j)] = (counts[(i + 1, j)] + counts[(i, j + 1)] - counts[(i + 1, j + 1)] + matrix[i][j])

      def f(x, y, c):
         # number of ways to make c more cuts in the part whose
         # top-left corner is (x, y)
         count = counts[(x, y)]
         if c == 0:
            # no cuts left: this last piece is valid only if it contains a 1
            return 1 if count > 0 else 0

         ans = 0
         # horizontal cut above row i: the top piece keeps
         # count - counts[(i, y)] ones, cutting continues at (i, y)
         for i in range(x + 1, m):
            if 0 < counts[(i, y)] < count:
               ans += f(i, y, c - 1)
         # vertical cut to the left of column j: the left piece keeps
         # count - counts[(x, j)] ones, cutting continues at (x, j)
         for j in range(y + 1, n):
            if 0 < counts[(x, j)] < count:
               ans += f(x, j, c - 1)

         return ans % p
      return f(0, 0, k - 1)

ob = Solution()
matrix = [
   [1, 1, 0],
   [1, 0, 1],
   [1, 1, 1],
]
k = 2
print(ob.solve(matrix, k))

Input

[  
[1, 1, 0],  
[1, 0, 1],  
[1, 1, 1]
], 2

Output

4
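The recursion can revisit the same (x, y, c) state many times, so for larger inputs memoizing f is a natural optimization. Below is a minimal sketch of that variant, assuming it is acceptable to cache on (x, y, c) with functools.lru_cache; the class name SolutionMemo is only illustrative −

from collections import defaultdict
from functools import lru_cache

class SolutionMemo:
   def solve(self, matrix, k):
      p = 10 ** 9 + 7
      m, n = len(matrix), len(matrix[0])
      counts = defaultdict(int)
      for i in range(m - 1, -1, -1):
         for j in range(n - 1, -1, -1):
            counts[(i, j)] = (counts[(i + 1, j)] + counts[(i, j + 1)] - counts[(i + 1, j + 1)] + matrix[i][j])

      @lru_cache(maxsize=None)
      def f(x, y, c):
         # same recursion as above, but each (x, y, c) state is computed once
         count = counts[(x, y)]
         if c == 0:
            return 1 if count > 0 else 0
         ans = 0
         for i in range(x + 1, m):
            if 0 < counts[(i, y)] < count:
               ans += f(i, y, c - 1)
         for j in range(y + 1, n):
            if 0 < counts[(x, j)] < count:
               ans += f(x, j, c - 1)
         return ans % p

      return f(0, 0, k - 1)

print(SolutionMemo().solve([[1, 1, 0], [1, 0, 1], [1, 1, 1]], 2))   # 4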
