XOR - The Magic Swap
Written by Mike James & Harry Fairhead   
Saturday, 30 July 2022

The Additive Swap

Less obvious is a trick that still moves the data but does so without any intermediate storage. There are a number of versions of the method, the simplest being based on arithmetic.

The key to the idea is to notice that if you add the value in B to A then you can still recover the original value of A by subtracting B. Add to this the fact that you can also recover the original B by subtracting A from the total and you have the start of an idea.

Using this fact it isn't difficult to see that you can swap the contents of A and B by a three-stage process:

Instruction    Value in A    Value in B
                   A             B
A=A+B             A+B            B
B=A-B             A+B            A
A=A-B              B             A

This really does fulfill the beginner's intuition that if you have two bowls of cereal at the start and two at the end you should be able to swap them over using just two bowls.

Notice that what is going on is that as you proceed with the swap you make sure that both values are recoverable at every step. When you add B to A you still have B, so you can undo the process by subtracting B. At the second step you can get back B by subtracting A. At no point have you lost the values of A and B, you have just mixed them up in a reversible way.

Ah if only it was possible with cereal. 
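Here is a minimal sketch of the three steps as a C function - the names are purely illustrative and the sum is assumed to stay within the range of an int:

#include <stdio.h>

/* Swap two ints without a temporary, assuming a + b stays in range. */
void additive_swap(int *a, int *b) {
    *a = *a + *b;   /* A now holds A+B                    */
    *b = *a - *b;   /* (A+B) - B recovers the original A  */
    *a = *a - *b;   /* (A+B) - A recovers the original B  */
}

int main(void) {
    int a = 3, b = 7;
    additive_swap(&a, &b);
    printf("a=%d b=%d\n", a, b);   /* prints a=7 b=3 */
    return 0;
}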

This still looks like magic; is there any drawback to the method?

The answer is that it can all go wrong if the value of A+B is larger than the variable can hold. In a sense we are making use of the full range of values that a variable can store to hold A+B in a single variable, using -B or -A as decoding keys to get the original values back.

The cereal bowl equivalent is tipping all the contents of B into A and then having a magic way of separating them out again. Even if you did have a magic way of separating them, this only works as long as bowl A is big enough to hold both cereals.

This is a serious limitation.

However, if you replace addition with any operation that combines the two values in a way that still lets you separate them using the original values, then you have another swapping method.

The XOR Swap

The operation with the most promise is the exclusive OR operation - often denoted as XOR. The A XOR B operation takes each bit in the binary representation of A and combines it with the corresponding bit of B to give a result according to the following table:

 A B  A XOR B
 0 0     0
 0 1     1
 1 0     1
 1 1     0

Many languages have a bitwise exclusive OR operator. In JavaScript, C and C# it is written A^B.
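For example, a quick check of the operator in C - the values are arbitrary:

#include <stdio.h>

int main(void) {
    unsigned a = 0xC;           /* binary 1100 */
    unsigned b = 0xA;           /* binary 1010 */
    printf("%X\n", a ^ b);      /* prints 6, i.e. binary 0110, as the table predicts */
    return 0;
}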

The interesting thing about A XOR B is that

(A XOR B) XOR A

is B and

(A XOR B) XOR B

is A - which should be compared to A+B-A and A+B-B.

A simpler way of saying this is that 

A XOR (A XOR B)=B

and XOR is its own inverse.

XOR a value with something and you get a new value; perform the same XOR again and you get back the value you started with.
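Continuing the small example above, you can check this in C:

#include <assert.h>

int main(void) {
    unsigned a = 0xC;              /* binary 1100 */
    unsigned b = 0xA;              /* binary 1010 */
    unsigned mixed = a ^ b;        /* 0x6, binary 0110 */

    assert((mixed ^ a) == b);      /* XOR with A recovers B */
    assert((mixed ^ b) == a);      /* XOR with B recovers A */
    return 0;
}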

This all means that XOR behaves just like addition, and like subtraction as well, so you should be able to see how it can be used in the same three-stage swap routine given earlier. That is,

Instruction    Value in A    Value in B
                   A             B
A=A XOR B       A XOR B          B
B=A XOR B       A XOR B          A
A=A XOR B          B             A

and as you can't get an overflow from XORing two values together the method is foolproof and somehow seems to suggest that you can get a quart into a pint pot by storing two values in a single variable at the same time!

Of course the trick is that you are never storing two values in a single variable; you are storing two values in two variables at all times. It is just that for some of the time the values are stored in a mixed-up coding.
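A minimal C sketch of the triple XOR swap - again the names are illustrative, and note that the routine must not be handed the same variable twice, since A XOR A is zero:

#include <stdio.h>

/* Swap two values using the triple XOR, no temporary needed. */
void xor_swap(unsigned *a, unsigned *b) {
    if (a == b) return;    /* XORing a variable with itself would zero it */
    *a = *a ^ *b;          /* A becomes A XOR B     */
    *b = *a ^ *b;          /* (A XOR B) XOR B = A   */
    *a = *a ^ *b;          /* (A XOR B) XOR A = B   */
}

int main(void) {
    unsigned a = 3, b = 7;
    xor_swap(&a, &b);
    printf("a=%u b=%u\n", a, b);   /* prints a=7 b=3 */
    return 0;
}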

I am certainly not suggesting that the triple XOR method is the one to use every time you want to swap two variables, but it has its own fascination. In the dark old days of punched cards and flashing lights every assembly language programmer, i.e. every programmer, knew the trick because it saved having to use another memory location. Today many compilers perform the XOR swap as an optimization when they detect a variable swap using a temporary variable. The overall result is that even though the technique is still in use, programmers don't get to see it.

Shame to let a nice idea like that die out.

The XOR Hash

The XOR "trick" of seeming to be able to store two values in one variable can also be used in other situations.

Suppose you have two values, A and B, and you want to return A when you know B and B when you know A. The usual way of doing this is to store both values and return the one you weren't given.

A simpler solution is to store

D= A XOR B

Now when you are presented with a value X, which is either A or B, you return

X XOR D

which is B if X is A and A if X is B.
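A sketch of the idea in C, with arbitrary values standing in for A and B:

#include <assert.h>

int main(void) {
    unsigned A = 42, B = 99;
    unsigned D = A ^ B;       /* the only value actually stored */

    unsigned X = A;           /* handed A...   */
    assert((X ^ D) == B);     /* ...get back B */

    X = B;                    /* handed B...   */
    assert((X ^ D) == A);     /* ...get back A */
    return 0;
}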

The best known example of where this can be used is when you want to build a doubly linked list. In this case each node in the list has to hold two references - one to the node on its left and one to the node on its right. Usually you would use two variables to store the references, but now you can see that you only need one:

Pointer= Left XOR Right

Now if you are moving along the linked list from the start to the end you always have the pointer to the Left element - it's where you have just come from. So

Next = Pointer XOR Left

gives you the next node to move to on the right. 

If you are moving along the linked list from the end to the start then you always have the pointer to the Right element, as it is the node you have just come from. So

Next = Pointer XOR Right

gives you the next node to move to, this time on the left.
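In a language such as C, which lets you treat addresses as integers, a sketch of the idea might look like this - the node layout and helper are purely illustrative:

#include <stdint.h>
#include <stdio.h>

/* Each node stores a single link: the XOR of the addresses of its
   left and right neighbours, with NULL standing in at either end. */
typedef struct node {
    int value;
    uintptr_t link;
} node;

static uintptr_t ptr_xor(node *left, node *right) {
    return (uintptr_t)left ^ (uintptr_t)right;
}

int main(void) {
    node a = {1, 0}, b = {2, 0}, c = {3, 0};

    /* Build the list a <-> b <-> c. */
    a.link = ptr_xor(NULL, &b);
    b.link = ptr_xor(&a, &c);
    c.link = ptr_xor(&b, NULL);

    /* Walk from start to end: Next = Pointer XOR Left,
       where Left is the node we have just come from. */
    node *left = NULL, *current = &a;
    while (current != NULL) {
        printf("%d ", current->value);
        node *next = (node *)(current->link ^ (uintptr_t)left);
        left = current;
        current = next;
    }
    printf("\n");   /* prints 1 2 3 */
    return 0;
}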

How worthwhile this sort of implementation is remains debatable. It saves one unit of storage per node and many programming languages don't allow this sort of manipulation of a reference - it being too low level.

It is another fun use of the basic properties of the magic XOR swap. 

Are there more? Yes, in cryptography, but that really is another story.


