POW 1UP
March 27, 2010 5:19 AM   Subscribe

Is there a closed-form method for finding the unique subsets from the power set of prime factors of an integer?

Let's say I start with the integer 28, with prime factors {2, 2, 7}.

If I construct a power set from this set of factors, I get:

{{}, {2}, {2}, {7}, {2, 2}, {2, 7}, {2, 7}, {2, 2, 7}}

I can reduce this to a set of divisors by multiplying the elements inside the non-empty subsets, to get {2, 7, 4, 14, 28} (adding {1} as a default divisor).

However, two of the subsets are duplicates: {2} and {2,7}, so I am doing unnecessary calculations by simply iterating through all subsets.

I could loop through the subsets and keep a hash table keyed on the multiplied elements, or some similar brute-force approach. But a power set grows quickly, so there could be a very large number of subsets.
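(The brute-force approach described above might look like the following sketch in Python, where the set `divs` plays the role of the hash table; the function name is just for illustration.)

```python
import math
from itertools import combinations

def divisors_brute(factors):
    """Enumerate every subset of the factor multiset, multiply each
    subset out, and deduplicate the products with a set."""
    divs = {1}  # 1 comes from the empty subset
    for r in range(1, len(factors) + 1):
        for combo in combinations(factors, r):
            divs.add(math.prod(combo))
    return sorted(divs)

print(divisors_brute([2, 2, 7]))  # [1, 2, 4, 7, 14, 28]
```

This still visits all 2^k subsets (duplicates included) before deduplicating, which is exactly the wasted work the question is about.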

Is there a smarter, more general way to do this, if I know what the labels (prime factors) are, something like a unique permutation of the factors, where some factors might be repeated?

(Note: This isn't homework.)
posted by Blazecock Pileon to Science & Nature (9 answers total) 1 user marked this as a favorite
 
Let's say your number has the factorization p_1^(e_1) p_2^(e_2) ... p_m^(e_m), where the p_i are distinct primes. For example, 28 = 2^2 7^1.

Then you want p_1^(f_1) ... p_m^(f_m) for each m-tuple (f_1, ..., f_m) where f_k is between 0 and e_k (inclusive).
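(A minimal Python sketch of this exponent-tuple idea, assuming the factorization is given as a {prime: exponent} dict; each divisor corresponds to one tuple (f_1, ..., f_m) with 0 <= f_k <= e_k, so no duplicates ever arise.)

```python
from itertools import product

def divisors_from_exponents(factorization):
    """Generate each divisor exactly once from a {prime: exponent} map
    by choosing an exponent f_k in 0..e_k for each prime."""
    primes = list(factorization)
    exponent_ranges = [range(e + 1) for e in factorization.values()]
    out = []
    for exps in product(*exponent_ranges):
        d = 1
        for p, f in zip(primes, exps):
            d *= p ** f
        out.append(d)
    return sorted(out)

print(divisors_from_exponents({2: 2, 7: 1}))  # [1, 2, 4, 7, 14, 28]
```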
posted by madcaptenor at 5:25 AM on March 27, 2010 [3 favorites]


two of the subsets are duplicates: {2} and {2,7}, so I am doing unnecessary calculations by simply iterating through all subsets.

If you are actually representing these using some kind of set data structure, duplicates should be removed for you automatically. By definition sets don't contain duplicates (e.g. {{0}, {0}} is the same set as {{0}}). However, if you are using a multiset, it won't do this for you. If you are rolling your own data structure, maybe the thing to do would be to see how a standard implementation of a set would do this. (Or just use one.)
posted by advil at 6:20 AM on March 27, 2010 [2 favorites]


Exactly what madcaptenor said: write out all the unique prime factors and how often each one occurs. Then, every element of the factor power set can be written in terms of taking f_1 of the first factor, f_2 of the 2nd factor, etc...

On that note, if there are factors p_1, p_2, ... p_m with counts e_1, e_2, ... , e_m, the number has (1+e_1) * (1+e_2) ... * (1+e_m) factors including 1 and itself.
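(That count formula is a one-liner; a sketch, again assuming a {prime: exponent} representation:)

```python
def divisor_count(factorization):
    """Number of divisors of n, including 1 and n itself:
    the product of (e_i + 1) over the prime exponents e_i."""
    count = 1
    for e in factorization.values():
        count *= e + 1
    return count

print(divisor_count({2: 2, 7: 1}))  # 6: the divisors 1, 2, 4, 7, 14, 28
```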
posted by bsdfish at 8:07 AM on March 27, 2010


(Note: This isn't homework.)

I'm pretty sure it's Euly, though.

Anyway, I agree with the above: Keep track of unique primes and the powers thereof.
posted by DU at 8:43 AM on March 27, 2010


If you are actually representing these using some kind of set data structure, duplicates should be removed for you automatically.

That would be language- or implementation-dependent, and kind of a cheat. My hope was to find a method for generating a power set that automatically removes duplicates, without relying on the vagaries of a particular language. But instead of using a power set, it seems better to just use an ordered-tuple approach, once I have the factorization.
posted by Blazecock Pileon at 4:48 PM on March 27, 2010


Then you want p_1^(f_1) ... p_m^(f_m) for each m-tuple (f_1, ..., f_m) where f_k is between 0 and e_k (inclusive).

Isn't that also just the list of factors?
posted by albrecht at 7:38 PM on March 27, 2010


albrecht, I thought the list of factors is what Blazecock was asking for.
posted by madcaptenor at 6:53 AM on March 28, 2010


Oh, sorry; I misunderstood the question. In that case, your answer is definitely the way to go.
posted by albrecht at 7:35 AM on March 28, 2010


No, I have the factors. I was trying to build a count of divisors. Using a power set will introduce duplicates, so I just need to generate the combinations of factors a different way.
posted by Blazecock Pileon at 3:25 AM on March 30, 2010

