“GDP has outlived its usefulness” – Hazel Henderson, Author & Futurist

Measuring GDP is more of an art than a science. It’s really tough to pinpoint the size of a country’s economy and to measure its rate of growth (or contraction during tough times). So experts take their best stab at it and tweak and refine their methodology as they go along. Yet so much rides on GDP: it shapes government policy and moves markets.

From where I sit, the main problem with the current approach to measuring GDP is that it doesn’t differentiate between actions that have very different long-term consequences. For example, all of the following are considered equal because the price tag is the same: $1 million spent on repairs after a storm, $1 million to build a jail, $1 million to produce a line of smartphones and $1 million to clear-cut a forest.

All spending goes into the positive category without taking into account the fact that some types of spending exacerbate inequality or deplete natural resources, which are things that can grow into major problems in the future.

With new technology comes the opportunity to harness new ways to get a read on public sentiment, which often goes hand in hand with economic growth or decline. A couple of mathematicians from the University of Vermont’s Complex Systems Center partnered with data specialist Mitre Corp. They analyzed a random sampling of 50 million tweets (in English) a day to track the ups and downs of public contentment using an index they call a hedonometer.
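The mechanics behind the index are simpler than they sound: score each word in a tweet against a list of crowd-rated “happiness” values and average the results. A minimal sketch in Python (the word scores and tweets below are invented for illustration; the real hedonometer uses a much larger crowd-rated word list):

```python
# Toy hedonometer: average the "happiness" rating of every scored word
# that appears in a batch of tweets. Word scores here are invented;
# the real index rates thousands of words on a 1-9 scale.
WORD_SCORES = {
    "happy": 8.3, "love": 8.4, "christmas": 7.9,
    "sad": 2.4, "bomb": 2.0, "tragedy": 1.8,
}

def hedonometer(tweets):
    scores = [
        WORD_SCORES[word]
        for tweet in tweets
        for word in tweet.lower().split()
        if word in WORD_SCORES
    ]
    return sum(scores) / len(scores) if scores else None

holiday = ["Merry Christmas", "love this day, so happy"]
crisis = ["such a tragedy", "sad news about the bomb"]
assert hedonometer(holiday) > hedonometer(crisis)
```

Unscored words (most of them) simply drop out of the average, which is why holidays and disasters show up so clearly: a few heavily weighted words swing the daily mean.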

They’ve been doing this since the fall of 2008, which is when the investment bank Lehman Brothers went belly up. So far, the saddest day on record was April 15, 2013… the day of the Boston Marathon bombings. Christmas and other holidays have consistently emerged as the happiest days.

So these findings aren’t exactly earth-shattering, but the hedonometer is an attempt at something that has long eluded economists. One of the main criticisms of GDP is that it isn’t able to measure what many consider the most important thing in life: happiness.

You can dismiss it as unscientific and flaky. But there are even wackier ways out there to tap into public sentiment. Like the diaper rash indicator. According to Procter & Gamble, American parents spend $1,500 on average per baby on diapers each year and change a diaper 6.3 times a day. When the economy takes a turn for the worse, though, it seems diapers are one of the first costs households cut.

And when diapers are changed less frequently that leads to more instances of diaper rash, which requires parents to pony up for diaper rash cream. Since 2009, the trend in the U.S. has been declining diaper sales and increasing diaper rash cream sales. Infer what you will.
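The reasoning behind the indicator boils down to a simple rule: falling diaper sales plus rising rash cream sales suggests households are stretching each diaper further. A toy sketch (all sales figures invented; a real signal would come from actual retail data):

```python
# Toy "diaper rash indicator": declining diaper sales alongside rising
# rash cream sales is read as a sign of household belt-tightening.
# All figures below are invented for illustration.
def trend(series):
    """Crude trend: last value minus first value."""
    return series[-1] - series[0]

def belt_tightening(diaper_sales, cream_sales):
    return trend(diaper_sales) < 0 and trend(cream_sales) > 0

# Hypothetical annual unit sales, 2009-2012:
diapers = [100, 97, 95, 92]   # declining
cream = [40, 42, 45, 47]      # rising
assert belt_tightening(diapers, cream) is True
```

A real version would need deseasonalized data and a less crude trend measure, but the core logic of these quirky indicators really is this simple: two series moving in opposite directions tell a story neither tells alone.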

None of these methods are a complete way of assessing economic progress, but combined with traditional methods of measuring GDP what emerges is a snapshot with more nuance and detail. With so much riding on how we quantify GDP, that’s probably a good thing. It’s easy to dismiss these other types of indicators, but to me they give you a real sense of the story behind the numbers.


“It’s interesting, when you look at the predictions made during the peak of the boom in the 1990’s, about e-commerce, or internet traffic, or broadband adoption, or internet advertising, they were all right—they were just wrong in time.” – Chris Anderson, author, entrepreneur & former editor-in-chief of WIRED magazine

Technology was supposed to be this great disruptor… shifting the power from retailers to consumers in a big way. But I have yet to see that revolution really take hold. Maybe it’s on the horizon, but so far, most people haven’t made the connection. The formula is pretty simple: technology = knowledge = power as a shopper.

Oh sure, you’ve probably chosen at least a few online purchases lately instead of buying from a bricks-and-mortar store. You may be an early adopter and use things like price comparison websites or apps that alert you to the closest place that sells the item you’re looking for at the cheapest price. You may even use your smartphone to store coupons and deals and then use that same device to pay for your purchases. But I believe this is all just the tip of the iceberg.

Consulting firm Capgemini’s latest survey suggests that price will continue to be the single biggest factor for shoppers for the next 5 years. You might think that means price is going to dictate where and how we shop but I would argue that thanks to technology (and a little know-how!) shoppers will be able to dictate to retailers where and how they shop.

Here’s why: shoppers need to be really aware of the battle that is raging between bricks-and-mortar stores and their online-only rivals. A company like Amazon can offer the same products as a (insert name of competitor here) at rock-bottom prices because it doesn’t have the overhead costs associated with running a physical store. This has given rise to a phenomenon known as “showrooming” and it has big box stores seeing red. Showrooming means potential buyers walk into a store that has the item they’re interested in on the shelf. Buyers play with or try the item on for size, use the store clerks for their expertise and then walk out and buy said item online for less money.

To combat showrooming, several big box stores now have price-matching policies: if you see something advertised somewhere else (in a flyer or online) for less, they’ll offer you the same price, provided you show proof of the competing deal. That puts the power in the hands of the buyer. It just requires a little time to do your research (but you research major purchases anyway, right?) and some preparation too, because you’ll need to print out the store’s price-matching policy as well as proof of the competing deal.

I’m not suggesting we all buy everything from big box stores because they’re offering to compete with online rivals. I firmly believe in supporting mom and pop shops whenever possible to keep them in business. But I also believe that budget-conscious consumers need to realize that technology = knowledge = power. And for whatever reason, many Canadians still aren’t using the tools at their disposal to harness that power.  Savings are a click away…


’Tis the season for wish lists. Yup, Santa is making his list and checking it twice and what you find under the tree is supposed to be a reflection of whether you’ve been naughty or nice.

Over the years, my wish lists have slowly but surely shifted to a tech focus. Tech toys were once on the periphery of the list, now they dominate. And it seems pricey tech gadgets are now coveted by kids and grown up kids alike.
I’ve compiled some anecdotal evidence to support my theory. Take, for example, my friends’ young son. He’s under the age of 3 and he’s already into tablets. Or another friend’s young nephew, who sent out his wish list as a Word file. I guess a pen-and-paper version just isn’t as easy to copy and distribute. Maybe he likes the fact that you can cram more into a Word file.
This isn’t all that surprising when you look at recent spending trends. According to Statistics Canada, the average amount spent by a typical household on tech communication (cell phone and wireless services… this doesn’t include home entertainment items like a cable bill) was about $1,731 in 2010. Compare that with 1990, before the internet went mainstream, when we spent roughly $175 on cable and $545 on phone services (that includes things like answering machines!).
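For what it’s worth, the arithmetic on those Statistics Canada figures looks like this (nominal dollars, no inflation adjustment, and note the two baskets aren’t identical since the 2010 figure excludes cable):

```python
# Household tech/communication spending, nominal CAD, from the StatsCan
# figures cited above. The baskets differ (2010 excludes cable), so this
# is a rough comparison, not an apples-to-apples one.
spend_1990 = 175 + 545   # cable + phone services
spend_2010 = 1731        # cell phone and wireless services

print(f"1990 total: ${spend_1990}")                # $720
print(f"2010 total: ${spend_2010}")                # $1731
print(f"Growth: {spend_2010 / spend_1990:.1f}x")   # about 2.4x
```

Even before adjusting for inflation, the category has roughly two-and-a-half-folded in twenty years, which is the trend the wish lists reflect.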

Not surprisingly, what we spend on reading materials like books, magazines, maps and newspapers has been slipping as our tech spend rises. You can blame smartphones, social media, and GPS systems. Even online subscriptions and e-readers, which were supposed to generate more interest in books and other publications, are cheaper than their predecessors, and (much to the chagrin of many writers) margins are thinner.

So our love of tech toys isn’t exactly tech-flation… because the prices of many things have come down. What’s happened is that the tech category now encompasses entertainment, communication, hobbies and side projects, money management and personal organization.

I know that if I could go back in time and talk to myself in the ’90s, I would tell the younger version of myself that in the near future, almost everyone I know is going to have a personal “smart” computer/phone/daytimer/camera that fits in the palm of their hand. These devices tell jokes, keep track of everything, connect you to everyone around the world and are a vehicle for something known as the internet, which is a kind of virtual library that contains the sum of all human knowledge. The younger version of myself would probably stare at the future version of me and think that sounds too good to be true… and well beyond anything I had hoped for on my Christmas wish list, circa 1990.


Do you remember when we spelled “the internet” with a capital I? Or the days when it was referred to as “the information super highway,” or better yet the “World Wide Web”? It was a time when AOL was still relevant, everyone had a Hotmail account, Google was just a company name not yet a verb, and Ask Jeeves was my go-to for almost everything.
Things have changed a lot in the past few decades. You’ve probably heard people say that before, but to see the changes laid out on a timeline really drives that point home for me. Check it out:
A Short History Of The Net
1969 – The internet is born! It was originally created for the U.S. military to allow a community of computers to share information even though they were far away from each other. This first version of the net was the ARPANET.
1972 – Ray Tomlinson, an engineer at the firm Bolt Beranek and Newman (BBN), invents email. It’s an internal messaging program for ARPANET. Within a year, three quarters of all ARPANET traffic is email.
1978 – The first unsolicited bulk email is sent… a phenomenon we now call spam. It was sent by Gary Thuerk who did marketing for a computer firm called Digital Equipment Corporation (DEC). He sent 393 ARPANET users a message promoting DEC.
1982 – Smiley faces :-) make their first appearance in emails, along with their counterpart the frown-y face :-(
1983 – As more and more people gain access to the ARPANET, it becomes unsafe for military use, so MILNET, a military-only network, is spawned. It is a separate, though similar, entity to the internet.
1986 – Internet newsgroups are born. The internet truly begins to unite people with similar interests from different parts of the world.
1988 – The first internet worm is unleashed by Robert Morris. It infects 6,000 computers, clogs up the internet and thousands of dollars’ worth of computer time is lost forever.
1989 – The World Wide Web is created by Tim Berners-Lee at CERN, the European organization for nuclear research. Its inception revolutionizes the way information is published and accessed on the internet. Early adopters were university science departments and other scientific institutes.
1993 – Marc Andreessen, who is with the U.S. National Center for Supercomputing Applications, launches the web browser Mosaic, which is a huge success, and businesses really start paying attention to the web’s potential for making money. Andreessen goes on to develop another web browser (this one you’ve likely heard of!): Netscape.
1995 – Search engine AltaVista is unleashed. It’s the first to offer multilingual search. Now-billionaire Jeff Bezos gives the world Amazon.com, an online bookseller that pioneers e-commerce. eBay is born this year as well.
1996 – The browser wars erupt, pitting Microsoft against Netscape. Macromedia Flash 1.0 is born and gets Disney and MSN on board.
1998 – Google steps onto the scene with its ranked search results and uncluttered design… it’s a standout from its rivals which are loaded up with animated advertisements.
1999 – Napster is unleashed and the music-sharing (some call it “stealing”) software lets internet users swap MP3 files. Record labels throw a hissy fit.
2000 – The dot-com bust.
2001 – Napster is shut down.
2004 – Mark Zuckerberg launches Facebook at Harvard University. In just 3 years, it boasts 30 million members. Broadband goes super mainstream and media companies are selling music and video online. Napster rises from the ashes as a paid music download store but its rivals now include Apple’s iTunes. Digital photography explodes and the photo-sharing site Flickr is born. The once-mighty Kodak discontinues reloadable film cameras in Western Europe and North America.
2005 – Other major industries are threatened by the internet’s rise, including traditional TV and telephone companies. YouTube sees the light of day (Google will snap it up the following year for $1.65 billion). Skype leads the charge as free internet-based phone calls threaten phone companies. eBay buys Skype for $2.6 billion.
2006 – Micro-blogging site Twitter is born, kick-starting a less-is-more phenomenon built on messages that are 140 characters or less.
2008 – Google turns 10. The mobile web reaches critical mass for advertisers.
2009 – Ashton Kutcher is the first person on Twitter to pick up one million followers… this was back when he was more relevant.
2010 – Facebook has 400 million active users. That’s more than the population of the U.S. and the U.K. combined!


“The best things in life are free.” – English proverb

 “Nothing is free. Everything has to be paid for. For every profit in one thing, payment in some other thing.” – Cassandra Clare, author

Sometimes, I like to separate my life into two periods: the era known as BSM (Before Social Media) and ASM (After Social Media). BSM meant time spent daydreaming, feeling my patience run out as I waited in line and occasionally twiddling my thumbs. ASM, on the other hand, has devoured all of my spare time.

My commute is forever changed, my fuse is infinitely longer… because there is always some feed or inbox that I can access to feed my voracious hunger for information, news, and personal updates.

One thing that all these relatively new pastimes have in common is that they are “free.” As much as we may all complain every time Facebook, Google or Twitter changes the look and functionality of its services, we would do well to remember that we’re not paying a single red cent to use these services. But to say that they’re “free,” as in requiring no payment, may not be entirely true either.

To illustrate my point, let’s take a closer look at Google’s new privacy policy, which kicked in last week. The company spins it as a way to keep things simple. Instead of having a myriad of privacy rules… why not have just one set? The company describes it as a move that “reflects our desire to create a seamless experience for our signed-in users.” Sounds good right? Not so fast!

This new, blanket privacy policy allows the company to engage in more cyber-spying on its users, which it has already started doing, in case you haven’t noticed. A perfect example is my love of polar bear cub videos. Or any type of cute bear. Google knows all about my obsession. I learned this the other day when I signed on to YouTube (which Google owns) and noticed that cute bear videos (along with wild and wacky covers of ’80s songs) figured prominently in the videos that YouTube suggested I might also enjoy.

Information about every website I visit, every item I Google search, every video I watch online, every picture I look at is stored and often filed into my personal dossier. This tracking is done from the moment I log onto any Google service (Gmail, YouTube, Picasa, Google +, etc.) until I’ve logged out.

A key thing to remember is that Google, whose motto is “Don’t be evil,” makes most of its money from ad revenue. Specifically, when users click on ads. What better way to get you and me to click on ads than by getting to know us and figuring out what we’re interested in?

Google isn’t the only company that’s going to be stepping up its efforts to get to know its users. Facebook is going public later this year and that means more scrutiny of how it makes money and more pressure on it to make more money too. The best way for Facebook to turn a bigger profit is through targeted advertising, and that is achieved via cyber snooping.

So you see, all the wonderful time-fillers that we’ve become addicted to come at a price. This price goes well beyond the time that you dedicate to all the different social media services. The real price for these freemium models is paid for with your privacy; if you’re not careful you are giving it up, piece by little piece.


“When I was young I used to think that money was the most important thing in life and now that I am old, I know it is.” – Oscar Wilde

Money doesn’t grow on trees. Well, Canadian money will soon no longer be grown on cotton plants and other types of trees. Instead, we’ll eventually be dealing with something akin to plastic money. The new polymer bills will be phased in to replace the cotton-paper blend notes we currently use, starting in November.

The new and improved plastic money is more durable, which is an obvious bonus in terms of being more cost effective. But another reason for the shift to this latest form of currency is to crack down on counterfeiting. According to the Bank of Canada, “funny money” hit a record high in our country in 2004, when we saw roughly 470 counterfeit notes for every one million in circulation. Apparently, today, we’re down to about 35 phony bills for every million. And this hard-to-replicate plastic money is expected to drive that number down even further.
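A quick back-of-the-envelope check on those Bank of Canada figures:

```python
# Counterfeit notes per million in circulation, from the Bank of Canada
# figures cited above.
peak_2004 = 470   # per million, the record high
today = 35        # per million

drop = 1 - today / peak_2004
print(f"Peak rate: {peak_2004 / 1_000_000:.4%}")   # 0.0470%
print(f"Current:   {today / 1_000_000:.4%}")       # 0.0035%
print(f"Decline:   {drop:.0%}")                    # about 93%
```

In other words, counterfeiting has already fallen by roughly 93% from its 2004 peak, even before the polymer notes arrive.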

These new blinged-out bills will include security features like holograms and clear panels, one of which is in the shape of a maple leaf. This type of high-tech currency is already being used in other parts of the world including New Zealand, Hong Kong and Mexico.

Did I mention that this new money is also supposed to be more environmentally friendly than its predecessor? That’s because these bills are expected to last nearly three times longer than the cotton-based notes. And when the plastic money is taken out of circulation it will be recycled. This will mark the first time in our history that “used” Canadian money is turned into other products. It all sounds very 21st century, at least on paper.

What I want to know is: will the arrival of plastic money mean more mom & pop shops accept fifty- and hundred-dollar bills? Because it annoys me to no end to end up at some independent gas station or off-the-beaten-path convenience store with no access to an ATM, only to find out they don’t accept anything bigger than a $20 bill. I find that sometimes, technology ends up doing the opposite of what it’s supposed to do. Case in point: plastic is supposed to make it easier to pay for things. In fact, it can make it more complicated and more time-consuming.

This reminds me of the quaint notion that technology would allow us to move to a wireless existence. Instead, I now have a myriad of wires that I need to plug my wireless devices into to recharge them and allow them to connect and communicate with each other. On balance, I’d say the wireless revolution has introduced more wires into my life, not fewer!

And what about our evolution to a so-called paperless office? Now that I have all these gadgets and super fast access to the internet, I print out hard copies of most things. And judging by the sheer volume of items that go through my blue box, I’d say my “paperless” workspace is quite the opposite.

What about the belief that all these nifty gadgets would allow us to have more free time? I distinctly remember when BlackBerries first hit the market here they were pitched as devices that would allow us to be more productive, work from anywhere and essentially allow us more freedom. Instead, we’ve become addicted to our smartphones, we waste tons of time sifting through an ocean of email, responding to messages, checking various social media platforms and we’re prisoners to the very devices that were supposed to free us from the shackles of our jobs.

This advent of “plastic” money makes me think that my vision of a cash-free society is nothing more than a pipe dream. I thought it might be a done deal when I got my new debit and credit cards in the mail, equipped with the latest chip technology. But it looks like cold, hard cash is here to stay, which is a shame because the last thing my wallet needs is more plastic!


“The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.” – Isaac Asimov

Last night I dreamt of Watson (more like a nightmare, really). Watson isn’t my friend, or some distant ex. He’s the supercomputer created by a team of very smart people at IBM. You may have caught him beating his human competition to take the grand prize on Jeopardy! recently.
What freaks me out is that Watson can learn. You see, it doesn’t bother me that things like my iPhone, for example, can learn to adjust to my touch and recognize the cadence of my speech and the tone of my voice. That’s just fine by me! But Watson, with his roomful of servers and access to all kinds of information, taps into all of this not to serve humanity, necessarily, but to demolish his human rivals. What happens when he starts posing questions like: “Why don’t I do whatever I want because I’m smarter than all of you?” and “Would you like to engage in nuclear warfare?”
OK, I admit it. Maybe I’ve read one too many Isaac Asimov novels (for the uninitiated, he wrote many books, including the iconic I, Robot). I’ve got machines becoming self-aware on the brain. But it’s not such a stretch of the imagination. It wouldn’t be the first time mankind did something and contemplated the consequences after the fact (i.e. giving mortgages to people who couldn’t afford them, derivatives, etc.). I do, however, recognize the wonderful future applications of the technology used to create Watson: the medical applications, the scientific learning. The possibilities are endless! This could pave the way for improving and saving countless human lives.
Perhaps part of my paranoia stems from the fact that another story has been dominating the headlines: the cyber attacks on federal government networks. The exact motivation, target and origin of these attacks remain unclear. But I think my brain has put two and two together. What if Watson-type technology was used (or is being used) to create a super-hacker, one that is relentless, tireless and far better than its human counterparts? Then what? But that’s just me being alarmist (it happens, I’ll get over it).
When I awoke from my Watson dream, my brain was reeling. And it’s been reeling ever since. I think that Watson has helped me figure out something that’s been bugging me! One question I’m looking to answer is: what will replace Web 2.0? (As in, after social networking, wikis and video-sharing sites.) I think that Watson is a clue! Let’s recap: Web 2.0 was a game-changer. It offered us all kinds of things that were unprecedented. Like real-time updates on what our friends and family were doing, as well as a direct conduit to communicate with celebrities and once-unattainable politicians and other Really Important People. Access to something akin to the sum of all human knowledge (not perfect, not always 100% accurate, but easy to access and searchable… I’m talking about you, Wikipedia!). It even gave us a glimpse of what goes on behind closed doors… a sneak peek at the lives of diplomats and world leaders, fly-on-the-wall status in some of the sauciest boudoirs, etc.
Next up will be access to super computer systems that can take all that knowledge that exists online and digest it for us. This Web 3.0 stuff isn’t just going to present information to us. It will teach us. It will move us. It will shape us in ways we can’t even imagine. For example, if I had a book report due on a particular novel, I might be able to find some existing reports online now. But in the future, I’ll have access to a new, original report, created by a Watson-esque system, based on all existing reports in its database. Right now, I can Google the term “African safari” and pull up photos, articles and video. In the future, a Watson-type computer will be able to incorporate all of that and create a unique “African safari” virtual experience for me. It will be on-demand. It will be multi-dimensional. It will be realistic. It will make whatever we’re doing now look as outdated as an abacus.


“The belly rules the mind.” – Spanish proverb

Eight straight days of mass protests in Egypt have finally led to president Hosni Mubarak’s announcement that he won’t seek re-election in September. But he’s not promising to step down right away either. Instead, he’s pledging to oversee a peaceful transfer of power and he’s promising meaningful reform. Regardless of whether or not you believe what he’s saying, the facts remain: Mubarak is in his 80s. He’s been ruling over Egypt for three decades now. Some say that under his regime, the country has failed to embrace modern life. Critics say Egypt is overrun by corruption, and weakened because of mismanagement and poor planning.

Mubarak has managed to keep a tight grip on power for thirty years by ruling with an iron fist. His regime has been backed by the U.S. government; Egypt has long been an important ally to America in the Middle East. But reports today suggest that president Obama’s administration told Mubarak that his days in power are coming to an end. That’s a shift in rhetoric from the White House, which had previously straddled the fence, neither calling on Mubarak to step down outright nor lending its full support to the plight of Egyptian protestors. For years, I suppose, the U.S. preferred the Devil You Know to the Devil You Don’t. There’s no telling what the new leadership in Egypt will be like, or whether it will be friendly to Western interests. But, arguably, change there is overdue.

What sparked the civil unrest was rising food prices. The poor and middle class suddenly found themselves struggling to afford basics. That sounds like a valid complaint. The thought of someone not being able to purchase something as simple as a loaf of bread or some rice seems unjust. But if you’re thinking about a potential food crisis long-term, then you’ll understand that food inflation, or food-flation as I like to call it, is ultimately a good thing. If food prices keep climbing, the people and companies that produce food have more incentive to produce more of it because it’s profitable. If it remains profitable, they’ll have the opportunity to grow their operations and invest in new technology, hire more people, and buy better equipment to produce more food, more efficiently.

If food prices were to level off, or plunge, the agri-business would be a less attractive space to be in. People and companies would flock to other industries. And less food would be produced. A scarcity of food would potentially lead to prices running up again, and the cycle would begin once more. But so long as there’s money to be made selling food, that product will continue to be created en masse. Take for example, a farmer with a huge field of corn. If each cob fetches a good price at the market, he’ll continue to grow it. If the price of corn were to drop dramatically and squeeze his margins, he may think of other more profitable ways to use that field: like growing the corn for biofuel (which doesn’t feed anyone), or converting it into a series of factories or dividing up the land and selling it off for housing. This same concept applies to a hog farmer, sugar cane producer, etc.

Now I’m not suggesting that a huge run-up in food prices is good for anyone, especially those who can least afford it. Sticker shock isn’t what I’m arguing for. But as long as food commands a high price, it will remain valuable, as it should be. That may be enough to prevent a real food crisis in the long run.


OK, this is totally biased… but totally hilarious! Never mind that they mispronounce Bernanke’s last name (Bernank vs. Bernankeeeee!). From the mouths of babes…


Quantitative easing. Yawn!

The term sounds clunky, and boring. But understanding what it is and what it means down the road is important.

First: what the heck is it? Quantitative easing is supposed to be a central bank’s monetary weapon of last resort. But what happens if you’ve painted yourself into a corner? Case in point: Ben Bernanke, the chairman of the U.S. Federal Reserve. He’s already brought rates down to historic lows, and that doesn’t leave him any wiggle room because there’s nothing lower than rock-bottom.

The American central bank, or the Fed as it’s commonly known, has a dual mandate: to keep inflation in check and to promote maximum employment. Well, there’s no sign of inflation being an immediate problem in the States, but the jobless rate there is stubbornly high. Roughly one out of every ten people in the U.S. labour force is out of work. That’s a huge number of people. Unemployment hasn’t been this high there in more than 25 years! So the Fed has to do something. It can’t count on Congress to pass more stimulus bills. Enter Ben Bernanke to the rescue!

His first go at quantitative easing pumped $1.75 trillion into the U.S. economy during the Great Recession. QE2, or round two, involves the Fed buying $600 billion worth of Treasuries over the course of 8 months. By buying long-term government bonds from banks, the Fed effectively bolsters their balance sheets. The hope is that the banks will lend some of that money to people and to businesses and spur economic growth.

With the Fed buying up Treasuries, that will drive these types of government bond prices up and yields will drop. That may lead investors to put their money into other types of investments that give them a better return… like stocks. So a stock market rally could result. Keep in mind though, that Bernanke has been hinting about QE2 for months and a lot of the “lift” in U.S. stock prices is already baked in.
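The inverse dance between bond prices and yields falls straight out of the arithmetic. A toy example with a simplified zero-coupon bond (all numbers invented for illustration; real Treasuries pay coupons, which complicates the math without changing the direction of the relationship):

```python
# A simplified zero-coupon bond paying $1,000 at maturity in 10 years.
# The yield is the annual return implied by what you pay today:
#   price = face / (1 + yield)**years
#   =>  yield = (face / price)**(1 / years) - 1
def implied_yield(price, face=1000, years=10):
    return (face / price) ** (1 / years) - 1

# Fed buying pushes prices up; the same payoff bought dearer yields less.
print(f"At $700: {implied_yield(700):.2%}")  # ~3.63%
print(f"At $800: {implied_yield(800):.2%}")  # ~2.26%
assert implied_yield(800) < implied_yield(700)
```

Same $1,000 payoff, higher purchase price, lower annual return: that mechanical link is why heavy Fed buying pushes yields down and nudges investors toward stocks and commodities instead.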

Another place investors may decide to park their money is commodities. We’ve already seen the price of gold soar to new highs post-QE2. But things priced in U.S. dollars are bound to get a push upwards. Take oil, for example. It is bought and sold in greenbacks. One direct consequence of QE2 is that the value of the U.S. dollar goes down. That’s because QE: The Sequel is a signal to the rest of the world that the American economy isn’t in the best shape. So investors sell their dollars, driving the currency down. When the dollar falls, oil becomes cheaper for buyers holding other currencies, and that extra demand pushes its U.S.-dollar price up… that’s essentially why the two are negatively correlated. Ditto for gold and many other commodities.

So a surge in oil prices could translate into higher prices at the pumps for you and me. Pricier gold means pricier jewelry (or a better deal if you’re inclined to trade in your gold baubles for cash). More expensive cotton could mean you shell out more for certain clothes. Higher grain, meat, sugar, cocoa and coffee prices equal food (and drink) inflation.

A lower greenback on the other hand, means more purchasing power for Canadians because typically when the American dollar is weak, our dollar rises. Your loonies will go further when you buy things across the border, or order items online from retailers in the States. That will put more pressure on our exporters though, because when our currency is strong, Canadian goods become more expensive to the rest of the world (unless the purchasing country’s currency has appreciated too!). So European imports, like a designer purse for example, may be less expensive.
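The cross-border math is straightforward. A sketch with hypothetical exchange rates (these are illustrative numbers, not actual CAD/USD quotes):

```python
# What a US$500 online order costs in Canadian dollars before and after
# the greenback slides. Rates are hypothetical, quoted as CAD per USD:
# fewer loonies needed per U.S. dollar means more purchasing power.
usd_price = 500.0

strong_usd = 1.05  # CAD per USD before QE2 weakens the greenback
weak_usd = 0.98    # CAD per USD after

print(f"Before: C${usd_price * strong_usd:.2f}")  # C$525.00
print(f"After:  C${usd_price * weak_usd:.2f}")    # C$490.00
assert usd_price * weak_usd < usd_price * strong_usd
```

The same mechanism works in reverse for Canadian exporters: the identical Canadian-dollar price tag costs American buyers more U.S. dollars when the greenback is weak.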

That’s just a sample of the kinds of things that QE2 has set in motion. A key thing is that if it does what the Fed hopes, and revitalizes the American economy, then Canada stands to benefit. Because when our biggest trading partner gets back on its feet, we win big time.

Now, some people may be wondering where the Fed gets all this money for QE2. Here’s the kicker: it just creates it. Out of nothing. Presto! It creates an account with money in it. Whether or not it should indulge in QE2 because it can, is a topic best left for another entry.
