Non-Participation As An Effective Weapon Against Tyranny
Brandon Smith
Legitimate revolution takes time, patience and fortitude. Unfortunately, this is a strategic concept that is lost on many Americans today who suffer from a now common ailment of attention deficit disorder and an obsession with immediate gratification. Even some who have their hearts in the right place and who work to defend and resurrect our nation’s founding ideals seem to believe that any action to defeat corrupt oligarchy must be effective immediately, otherwise, it’s not worth the attempt. History, of course, teaches us the opposite.
The American rebellion against the British monarchy was not an abrupt or immediate affair. Anger and unrest over the trespasses of King George simmered for decades. The first British troops stationed with the intent to stifle colonial freedoms arrived in Massachusetts in 1768. The Boston Massacre took place in 1770, and still, the Founders refused to leap into open retaliation. Lexington Green and the “shot heard round the world” did not come until April 19, 1775. The Revolution took years to culminate in an actual physical war. So what did the colonists do in the meantime? Sit on their hands?
In fact, early Americans employed economic tactics against their enemy long before they picked up muskets and powder. British imports were turned away or destroyed. Clothing and other items normally shipped from Europe to be sold in the colonies were boycotted, while colonists began producing all of their own survival necessities. They refused to participate in the system that was designed to enslave them, and this gave them a foundation on which to launch their eventual fight for liberty. Without those efforts toward economic independence, the American Revolution might never have taken place.
I always recall this example whenever I am confronted with a gung-ho liberty movement activist who demands to know when “we” are going to “do something” about criminal government, or when I am confronted by nihilists who proclaim that “we” should have “pulled the trigger” long ago and that it is now “too late to do anything.” The Founders had the same doubts and faced the same naysayers, and they had the wisdom to act with the correct force at the correct moment.
The methods of non-participation have been repeated in many dissenting actions against despotic establishments, often with much success. This does not mean that one can necessarily topple tyrants simply by refusing to use their goods or their currency. That would be a childish assumption. Very few tyrants have ever been removed from power without an act of force. However, the process of learning to become self sufficient makes each person more effective as an activist or revolutionary, and thus, more dangerous to those who seek control.
Sadly, one of the greatest threats to the American public in 2013 is the possibility that our government will cut off public access to Federal funds. Our society has become so addicted to government-tainted money that up to one third of the country relies on some form of paycheck or welfare from the system. If the system breaks, or is deliberately sabotaged, the sickening level of citizen dependency today makes catastrophe inevitable.
The most interesting aspect of the current “shut-down” situation is the fear it is generating, and the partisan fury it facilitates. Republicans and Democrats are nearly ready to tear each other’s throats out over the continuance or non-continuance of a political body that no longer functions anyway and has become a middleman for global banks. This interests me because it is an entirely solvable problem, yet the average American appears completely ignorant of the fix.
Most people are either ready to riot, ready to undermine themselves with bad legislation or a Constitutional Convention, ready for a military coup (another idiotic idea the elites would enjoy), or they have become despondent and uselessly morose, when all they really have to do is consider that perhaps they should not be so dependent on such an unstable economic structure or political dynamic in the first place.
The real power is in OUR hands, and has always been in our hands. Federal welfare, and the idea of the loving provider nanny state are the great illusions. The idea of barreling head first to topple this soulless machine, though satisfying to consider, is also inadequate. A more rounded strategy is required...
The mindless drive for infinite spending often associated with the “Left” is a recipe for utter fiscal disaster in the form of suffocating liabilities, massive deficits, and a hyperinflated dollar. But the mainstream Republican notion that there will be no consequences if debt default occurs is equally foolish. I have been astounded by wild assertions from the GOP that American tax revenues will be more than enough to cover interest payments on U.S. debts. There are many conservatives and Liberty Movement analysts who should know better than to use official Treasury Department interest numbers and debt numbers to support their arguments.
Given that the real U.S. national debt including entitlement programs is estimated at around $200 trillion, and real deficit figures stand at around $5 trillion per year, I wonder how anyone in their right mind could claim that annual tax revenues of $2.5 trillion (the 2012 direct revenue figure) could possibly cover foreign interest payments on top of existing liabilities. A massive piece of flesh has to be taken somewhere, and the U.S. financial system is on life support already.
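To put those figures side by side, here is a minimal back-of-envelope sketch in Python. It simply restates the estimates quoted above ($200 trillion in liabilities, a $5 trillion real deficit, $2.5 trillion in 2012 revenue); these are the article's figures, not official Treasury numbers:

```python
# Back-of-envelope comparison using the estimates quoted in this article,
# not official Treasury figures.

total_liabilities = 200e12   # estimated real national debt, incl. entitlements ($)
annual_deficit    = 5e12     # estimated real annual deficit ($)
annual_revenue    = 2.5e12   # 2012 direct federal tax revenue ($)

# Revenue covers only half of a single year's deficit spending,
# before one dollar of interest or principal is paid.
print(f"Revenue as a share of the annual deficit: {annual_revenue / annual_deficit:.0%}")

# Every dollar of revenue for this many years would be needed just to
# equal the estimated total liabilities.
print(f"Years of total revenue needed to match liabilities: "
      f"{total_liabilities / annual_revenue:.0f}")
```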
The no-worries default theory does not take into account the fact that median household incomes have been dropping every year for the past five years, thus continuously diminishing tax revenue opportunities. It does not take into account the massive spike in interest payments that would come with a foreign sell-off of U.S. Treasury debt. It does not take into account the possibility that foreign creditors might refuse to accept payments on interest in U.S. dollars (as creditors refused to take German Marks as payment during their hyperinflationary collapse). Nor does it take into account the eventual loss of international faith in the dollar as the world reserve currency, which would rain havoc down upon the U.S. populace in the form of dollar devaluation and exploding prices on every commodity imaginable. Think this cannot or will not happen? The Chinese are now openly calling for it to happen!
There is a false notion floating around alternative economic circles that a default is caused only by the refusal of a nation to pay the interest on existing debt obligations, and that these debt obligations are somehow static and predictable. In fact, a default also takes place when Treasury bonds (consider the vast sums of short-term bonds circulating in foreign coffers) are unexpectedly liquidated by creditors and the debtor nation does not honor them in full. Greece went through a similar “haircut” process, though the domestic and global effects were limited because Greece does not hold the distinction of issuing the world reserve currency. The U.S. does carry that distinction, and with it comes danger. America has defaulted on a debt payment only once during the time in which the dollar has served as the world reserve currency, in 1979. That was a "technical" default the government blamed on an organizational error, and it was extremely short lived.
Never has the U.S. gone into full default mode while holding world reserve currency status.
Let’s not delude ourselves, fellow conservatives. There is a steep price to be paid for debt default. I fully agree that mathematical reality requires our nation's government to end its fiscal spending obsession, and if that means we suffer a painful withdrawal, then so be it. Obama cultists don't seem to grasp that this event is inevitable anyway. However, attempting to gloss over the consequences will only make it easier for the mainstream media to demonize us later down the road as the cause of the coming catastrophe. There is no way around it. There is no magical silver bullet solution to avoid the pain. We will have to take our medicine, one way or the other…
Fantasies of a serene default scenario are luring many conservatives into a false sense of security which distracts them from preparation. Never assume we are destined for the best possible outcome. Always assume the shark circling you in the water sees you as dinner.
Whether the White House gets its way, or no one gets his way and the whole debacle eventually ends in default (which has been my personal prediction since the debate began), America’s economy will face the same destruction on only slightly different timetables. As I pointed out in my last article, The Possible Outcomes Of The Shutdown Theater, the only conceivable winners will be international banks, who can play the market and bleed it dry based merely on what their politician friends tell the media on any given day, and who ultimately WANT to dismantle the United States, our economy, our sovereignty and our Constitution to make way for a new global financial edifice.
If there is no way for the average American to win this game because the rules have been written by our opponent, then perhaps we should stop playing the game altogether.
This means millions of Americans must actively pursue a more independent standard of living. This means each and every person must learn to provide all of their own survival necessities, including food, water, shelter, energy and self defense. This means growing a well-planned garden and educating one’s self on raising livestock. This means learning a valuable trade skill that is useful and always sought after regardless of the state of the mainstream economy. This means striving for off-grid status and cutting ties to electric, gas, and water utility companies. This means training to keep one’s self and one’s family safe in an atmosphere of violence where state sanctioned law enforcement may not be present to protect you. This means building relationships within one’s neighborhood, town or county that allow for proactive organization without the oversight of government. This means providing your own community safety and disaster response without the aid of FEMA. This means establishing alternative local trade (like a barter market) that is NOT dependent on the Internet or any other government watched and regulated network. This means refusing to follow the mandatory directives of Obamacare. This means removing your children from federally funded and dictated schools. This means divorcing yourself completely from government.
There are those within the Liberty Movement that are working to make it easier for regular people to transition away from the mainstream, providing outlets for education and organization for those seeking more independence through non-participation. My own website, www.Alt-Market.com, is geared towards helping people network for barter and mutual aid at the local level.
Oath Keepers, a Constitutional organization of veterans, currently serving military, police, firefighters and concerned civilians, has just launched its “Civilization Preservation Program.” It is designed to set up highly adaptable training groups across the U.S. who will teach any interested citizens within their community the survival methods needed to endure disaster, whether natural or man-made, as well as how to rebuild as the storm subsides.
If one is dependent on a tyrant, one cannot hope to defeat that tyrant. The reason so many people are afraid of the results of government shutdown and debt default is because so many people refuse to step away from the system. The reason so many people are afraid to fight back is because they have seen the establishment as their source of income for so long. If more Americans were self-reliant, if more Americans were willing to give up free goodies from the state, if more Americans built their own economic foundations, a collapse of our financial structure would be meaningless. We could simply sit back comfortably and let it die, for why would we care about the funeral pyre of a vicious and reckless political/corporate suicide train?
As things stand at this moment, though, the death of the system is not something to cheer, no matter how much we might wish it to crumble under the weight of its own criminality. The collapse of the existing system will not be the end of our troubles, only the beginning. Chaos always opens doors for evil men, and they will certainly take full advantage of the chaos triggered by shutdown, default or continued inflationary debt spending.
We must make ourselves ready to resist by making ourselves separate from the monster we plan to fight. Crisis waits for no one, and on the path our nation now walks, crisis is assured.
Link:
http://alt-market.com/articles/1773-non-participation-as-an-effective-weapon-against-tyranny
Wednesday, April 30, 2014
"...most everyone recognizes that freedom of speech entails the fundamental right to make statements that everyone else considers to be despicable. In fact, that’s the real test of a free society in terms of free speech — not whether people are free to say the right things but rather whether people are free to say the despicable things."
Racial Bigotry and the Free Market
by Jacob G. Hornberger
Statists have long taken libertarians to task for opposing mandatory integration laws and defending the right of bigoted owners of business establishments to discriminate against people on the basis of race. They inevitably accuse libertarians of being racists themselves or of supporting racial bigotry by virtue of libertarian opposition to mandatory integration laws.
What statists just don’t get, however, is: one, the importance of principle when it comes to individual liberty, and, two, that the free market, not governmental coercion, is the best way to deal with racial bigotry.
No better example of the libertarian position can be found than the current controversy surrounding Donald Sterling, the owner of the Los Angeles Clippers. Sterling was caught on tape making prejudicial remarks against blacks to his girlfriend, even exhorting her to not associate with blacks.
Yet, let’s notice something important here: Most of the Clipper team is composed of blacks!
How is that possible? Here you have an owner who is clearly prejudiced against blacks and who obviously does not want to associate with them. Why in the world does he have so many blacks on his basketball team? Why not instead hire mostly whites?
The answer is very simple: Sterling’s love of the color green trumps his dislike of the color black.
This is what the market does — it breaks down racial prejudice by imposing a cost on the bigot.
No doubt in Sterling’s ideal world, there would be nothing but white players on his team and perhaps even in the entire NBA. But reality sets in: If Sterling wants to win, he has to hire lots of talented blacks, especially because other owners are doing so. Letting his racial prejudices play out with respect to selecting the players on his team means losing money, lots of money.
So what does Sterling do? He does what his competitors do — notwithstanding his racial prejudices, he fills his team with great basketball players even if they are black, with the aim of winning. Winning usually means higher revenues and profits.
That’s undoubtedly what would have happened throughout American society if mandatory integration laws had not been enacted after segregation was ended. Bigoted business owners would have found it in their economic self-interest to swallow their racial prejudices in order to survive and prosper in the business world.
Equally important, consider how the free market is responding to the Sterling controversy.
People are outraged over Sterling’s comments. Yet, as far as I know, no one, not even the most ardent statist, is calling on the government to punish him, jail him, fine him, or muzzle him.
That’s because most everyone recognizes that freedom of speech entails the fundamental right to make statements that everyone else considers to be despicable. In fact, that’s the real test of a free society in terms of free speech — not whether people are free to say the right things but rather whether people are free to say the despicable things.
Thus, a principled commitment to liberty entails defending the right of the bigot to make bigoted statements.
But the same principle applies to how a bigot uses his business. The business belongs to him. He’s the owner. He has the same fundamental right to discriminate in his business as he has to make bigoted statements. That’s what freedom is also all about: not just the right to engage in responsible conduct but the right to engage in conduct that everyone else considers irresponsible and despicable.
Does that mean that people are powerless to deal with a bigoted business owner who decides to bring his bigotry into the marketplace?
The Sterling controversy is demonstrating the power of market forces — that is, voluntary action as compared to governmental force.
It’s clear that there are lots of people who want to rid professional basketball of Donald Sterling.
Commercial sponsors are fleeing the Clippers in droves, thereby depriving Sterling of those high revenues.
Basketball fans are talking about boycotting Clippers’ games. Imagine the effect that would have on Sterling, as he would have to continue paying his players enormous salaries while watching his income from ticket sales take a dive.
The boycott option would probably be exercised if league officials did nothing. But league officials are doing something. In fact, they’re doing a lot. After verifying that Sterling did, in fact, make the derogatory statements, they imposed a lifetime ban on him from participating in the NBA, imposed a $2.5 million fine on him, and are calling on the other owners to band together to urge Sterling to sell the team.
Why are they doing that? One possibility is that league officials too are offended by Sterling’s remarks. But another possibility is that the color green is important to them too. They know that if they don’t take drastic action, sponsors and fans might resort to boycotts of not only the Clippers but of the league itself.
Notice something else taking place here: the power of social ostracism. That is serving as a powerful force that is sending circumstances in a positive direction, all without governmental intervention.
The Sterling controversy demonstrates that libertarians have been right on this issue the entire time. Segregation should have been ended and forced integration should never have been enacted. A bigot has a right to be a bigot, not only with respect to his speech but also to his actions. Let the free market, not governmental force, nudge the bigot in the right direction.
Link:
http://fff.org/2014/04/30/racial-bigotry-and-the-free-market/
"Should personal opinions be illegal? Should personal opinions result in the loss of your livelihood?"
Should Opinions Be Illegal? The Donald Sterling Scandal
By Shane Kastler
I suspect that Los Angeles Clippers owner Donald Sterling is a rich, arrogant jerk who might have unfavorable views toward those different from him. I also suspect many of the young black athletes he employs are rich, arrogant jerks who have unfavorable views toward those different from them. Arrogance is not limited to any particular race. However, in America today, arrogance from a “minority” is completely acceptable, while arrogance from a white person might cost him his livelihood.
With all of that said, let me offer a word of caution to the media and all who (in typical fashion) have blown the Sterling scandal out of proportion. Who cares if Donald Sterling is a racist? It's unfortunate. It might be nice if his views changed. But in the end, who cares? And why on earth would the media be so enamored with the story anyway? And furthermore, why should a black person care for one second what Donald Sterling thinks of them? He's obviously willing to shell out millions upon millions to pay black athletes to play for his team. If he's such a racist, then why would he not employ white players only? Answer: because he's smart enough to know who will win ballgames and make money for his organization. Which leads me to another question.
Should personal opinions be illegal? Should personal opinions result in the loss of your livelihood? Frighteningly, the answer from most Americans seems to be yes. But the U.S. Constitution and common sense say otherwise. No matter how wrong Sterling's opinions might be, he has a right to hold them. Yet the media-induced lynch mob would seek to destroy anyone who doesn't kowtow to their mantra. Al Sharpton (a racist in his own right) has called for the Clippers to be “taken away from Sterling.” By whom? Who does Sharpton think has the authority or the right to do such a thing? The government? This is none of their business.
Others are calling for the NBA to seize the team from Sterling. And for what reason? Because he made some statements to his “girlfriend” expressing irritation at her constantly posting pictures of herself with young, black men. While that might be construed as racist, one suspects that Sterling might have been equally annoyed if she were constantly posting pictures with young white men. The “race” probably isn't the issue so much as the “gender” is.
On Tuesday, NBA commissioner Adam Silver banned Sterling for life. That's a truly scary thought. What if Sterling changes his views? What if he becomes a poster child of political correctness? There's no forgiveness? He's banned for life because of his opinion? He's banned for life because of statements made in a personal conversation with a ditzy girlfriend who secretly taped him? My goal is not to defend Sterling. As I already stated, I assume he's an arrogant jerk. But arrogant jerks are entitled to their opinions too. Former NBA player (and current executive) Larry Johnson called for an all-black basketball league. This is far more racist than anything Sterling said. Will the NBA ban Johnson? Will ESPN air round-the-clock coverage of Johnson's comments? No. He's black, which means he's allowed to be a racist, according to the warped mindset of modern America. What if Larry Bird called for an all-white league? He'd be crucified.
Personally, I haven't watched an NBA game in 20 years and don't plan to for the next 20. The league is nothing more than a politically correct cesspool of thuggery and arrogance. Once Jordan, Magic, Larry, and Dr. J retired, I quit watching. The league has spiraled downhill ever since. And if your opinions are anything contrary to the status quo, you can expect to be banned. For life.
Be careful in how quickly you might condemn Sterling for his alleged racism. And furthermore, be careful in espousing how you think he should be punished. Opinions are not against the law. And the same powers that would silence the racial opinions of Sterling today might silence your religious or political opinions tomorrow. Don't fall into the snares of the media-induced hysteria. Sterling's comments might be a little annoying, but in the grand scheme of things they are a non-issue writ large by the politically correct establishment that controls much of this country. Violent actions should be illegal. Personal opinions (no matter how stupid they may be) should not be. “Live and let live!” Isn't that what the Al Sharptons of the world always espouse? Perhaps it's time for them to practice what they preach.
Link:
http://www.economicpolicyjournal.com/2014/04/should-opinions-be-illegal-donald.html#more
Where's the outrage in the American press??? Oh, too busy with some crazy NBA owner...
Widow devastated as judge rules her $280,000 home will be sold over unpaid $6.30 tax bill
Eileen Battisti insists she paid all taxes in full and was unaware she owed the additional $6.30
The court ruled that she was given multiple notices about the looming sale and that she even acknowledged receiving them
The home sold for less than half market value at a 2011 auction
By Ryan Gorman and Associated Press Reporter
A widow has been told for the second time by a Pennsylvania court that her home's sale at auction after she failed to pay property taxes is valid - she owed only $6.30 at the time it was sold.
Eileen Battisti, 53, of Aliquippa, lost legal rights to her $280,000 home over two years ago after failing to pay the paltry sum but has made multiple appeals on grounds she did not know it was owed.
The most recent decision made last week denied her request to reverse the September 2011 sale of a house she is still reportedly living in.
'I paid everything, and didn't know about the $6.30,' Battisti said. 'For the house to be sold just because of $6.30 is crazy.'
She had previously owed other taxes, the court noted, but at the time of the sale she owed just $235, including other interest and fees.
Beaver County Common Pleas Judge Gus Kwidis wrote in his ruling that the county tax claim bureau complied with notification requirements in state law before the auction.
'There is no doubt that (she) had actual receipt of the notification of the tax upset sale on July 7, 2011, and Aug. 16, 2011,' the judge wrote. 'Moreover, on Aug. 12, 2011, a notice of sale was sent by first class mail and was not returned.'
Battisti also admitted to receiving those notices, the judge asserted, according to the Pittsburgh Tribune-Review.
He wrote in the ruling that there 'is no doubt' Battisti 'had actual receipt' of them.
The property sold for about $116,000, and she is entitled to $108,039 if subsequent appeals are unsuccessful, according to the paper.
'She's going to get that money, but she's going to lose her house. All the notice requirements were met,' wrote Kwidis. 'In tax assessment laws, even if I feel sorry for her, I can't do anything to help her.
'Everyone felt bad about it.'
Read more: http://www.dailymail.co.uk/news/article-2615280/OK-sell-widows-home-6-bill-judge-rules.html#ixzz30Mp94m3k
End fractional reserve banking...
Prominent Economists Call for End of Fractional Reserve Banking
Washington’s Blog
Excessive leverage by the banks was one of the main causes of the Great Depression and of the 2008 financial crisis.
As such, the lower the level of “fractional reserve banking” – i.e. the fewer dollars a bank lends out compared to the amount of deposits it has on hand – the more stable the economy will be.
But economist Steve Keen notes (citing Table 10 in Yueh-Yun C. O’Brien, 2007. “Reserve Requirement Systems in OECD Countries”, Finance and Economics Discussion Series, Divisions of Research & Statistics and Monetary Affairs, Federal Reserve Board):
The US Federal Reserve sets a Required Reserve Ratio of 10%, but applies this only to deposits by individuals; banks have no reserve requirement at all for deposits by companies.
So huge swaths of loans are not subject to any reserve requirements.
Indeed, Ben Bernanke proposed the elimination of all reserve requirements for banks:
The Federal Reserve believes it is possible that, ultimately, its operating framework will allow the elimination of minimum reserve requirements, which impose costs and distortions on the banking system.
Economist Keen informs Washington’s Blog that about 6 OECD countries have already done away with reserve requirements altogether (Australia, Mexico, Canada, New Zealand, Sweden and the UK).
But there is a growing recognition that this is going in the wrong direction, because fractional reserve banking can destabilize the economy (and credit can easily be created by the government itself).
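To illustrate why reserve requirements matter for bank leverage, here is a small sketch of the standard textbook money-multiplier relationship (deposits supported = reserves / reserve ratio). It is an illustrative simplification under the usual textbook assumptions (every loan is re-deposited and banks hold no excess reserves), not a model of actual Federal Reserve operations:

```python
# Textbook money-multiplier sketch: under fractional reserve banking, each
# dollar of reserves can support up to (1 / reserve_ratio) dollars of deposits,
# assuming every loan is re-deposited and banks hold no excess reserves.

def deposits_supported(reserves: float, reserve_ratio: float) -> float:
    """Theoretical ceiling on deposits that a given reserve ratio allows."""
    if reserve_ratio <= 0:
        raise ValueError("With no reserve requirement there is no finite ceiling.")
    return reserves / reserve_ratio

reserves = 1_000.0
for ratio in (0.10, 0.05, 0.01):
    print(f"reserve ratio {ratio:.0%}: ${reserves:,.0f} of reserves can support "
          f"up to ${deposits_supported(reserves, ratio):,.0f} in deposits")
```

The lower the required ratio, the more credit each dollar of reserves can support; with no requirement at all, the textbook formula places no ceiling on private credit creation, which is the instability the pieces quoted below are concerned with.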
It was big news this week when one of the world’s most prominent economics writers – liberal economist Martin Wolf – advocated doing away with fractional reserve banking altogether… i.e. requiring that banks only loan out as much money as they actually have on hand in the form of customer deposits:
Printing counterfeit banknotes is illegal, but creating private money is not. The interdependence between the state and the businesses that can do this is the source of much of the instability of our economies. It could – and should – be terminated.
***
What is to be done? A minimum response would leave this industry largely as it is but both tighten regulation and insist that a bigger proportion of the balance sheet be financed with equity or credibly loss-absorbing debt. I discussed this approach last week. Higher capital is the recommendation made by Anat Admati of Stanford and Martin Hellwig of the Max Planck Institute in The Bankers’ New Clothes.
A maximum response would be to give the state a monopoly on money creation. One of the most important such proposals was in the Chicago Plan, advanced in the 1930s by, among others, a great economist, Irving Fisher. Its core was the requirement for 100 per cent reserves against deposits. Fisher argued that this would greatly reduce business cycles, end bank runs and drastically reduce public debt. A 2012 study by International Monetary Fund staff suggests this plan could work well.
Similar ideas have come from Laurence Kotlikoff of Boston University in Jimmy Stewart is Dead, and Andrew Jackson and Ben Dyson in Modernising Money.
***
Opponents will argue that the economy would die for lack of credit. I was once sympathetic to that argument. But only about 10 per cent of UK bank lending has financed business investment in sectors other than commercial property. We could find other ways of funding this.
Our financial system is so unstable because the state first allowed it to create almost all the money in the economy and was then forced to insure it when performing that function. This is a giant hole at the heart of our market economies. It could be closed by separating the provision of money, rightly a function of the state, from the provision of finance, a function of the private sector.
(The IMF study is here.)
In fact, a lot of experts have backed this or similar proposals, including:
Bank of England Chief Mervyn King
Prominent conservative economist Milton Friedman
Prominent liberal economist Irving Fisher
Prominent conservative economist Lawrence Kotlikoff
Prominent liberal economist James Tobin
Prominent conservative economist John Cochrane
Prominent liberal economist Herman Daly
Prominent British economist John Kay
Conservative Spanish economics professor Huerta de Soto
German economist Thorsten Polleit
Conservative French economist Jörg Guido Hülsmann
Telegraph columnist Ambrose Evans-Pritchard
Bloomberg columnist Matthew C. Klein
Interestingly, the Chicago Plan for full reserve banking came very close to passing in 1934. But the unfortunate death of one of its main Congressional sponsors – Senator Bronson M. Cutting – in a plane crash reversed the momentum for the bill.
As Wikipedia notes:
Cutting played a key role in the political struggles over the reform of banking which Roosevelt undertook while dealing with the Great Depression, and which resulted in the Banking Reform Acts of 1933 and 1935. As a supporter of the Chicago Plan proposed by economist Irving Fisher and others at the University of Chicago, Cutting was among a handful of influential Senators who might have been able to remove from the private banks their ability to manipulate the money supply by enforcing a 100 percent reserve requirement for all credit creation, as stipulated in the Chicago Plan. His unfortunate death in an airliner crash cut short what may have been his most enduring legacy to the nation.
Link:
http://www.washingtonsblog.com/2014/04/conservative-economist-wants-basically-ban-banking.html
In case you've been living in a cave the past 7 years...
17 Facts To Show To Anyone That Believes That The U.S. Economy Is Just Fine
By Michael Snyder
No, the economy is most definitely not "recovering". Despite what you may hear from the politicians and from the mainstream media, the truth is that the U.S. economy is in far worse shape than it was prior to the last recession. In fact, we are still pretty much where we were at when the last recession finally ended. When the financial crisis of 2008 struck, it took us down to a much lower level economically. Thankfully, things have at least stabilized at this much lower level. For example, the percentage of working age Americans that are employed has stayed remarkably flat for the past four years. We should be grateful that things have not continued to get even worse. It is almost as if someone has hit the "pause button" on the U.S. economy. But things are definitely not getting better, and there are a whole host of signs that this bubble of false stability will soon come to an end and that our economic decline will accelerate once again. The following are 17 facts to show to anyone that believes that the U.S. economy is just fine...
#1 The homeownership rate in the United States has dropped to the lowest level in 19 years.
#2 Consumer spending for durable goods has dropped by 3.23 percent since November. This is a clear sign that an economic slowdown is ahead.
#3 Major retailers are closing stores at the fastest pace that we have seen since the collapse of Lehman Brothers.
#4 According to the Bureau of Labor Statistics, 20 percent of all families in the United States do not have a single member that is employed. That means that one out of every five families in the entire country is completely unemployed.
#5 There are 1.3 million fewer jobs in the U.S. economy than when the last recession began in December 2007. Meanwhile, our population has continued to grow steadily since that time.
#6 According to a new report from the National Employment Law Project, the quality of the jobs that have been "created" since the end of the last recession does not match the quality of the jobs lost during the last recession...
•Lower-wage industries constituted 22 percent of recession losses, but 44 percent of recovery growth.
•Mid-wage industries constituted 37 percent of recession losses, but only 26 percent of recovery growth.
•Higher-wage industries constituted 41 percent of recession losses, and 30 percent of recovery growth.
#7 After adjusting for inflation, men who work full-time in America today make less money than men who worked full-time in America 40 years ago.
#8 It is hard to believe, but 62 percent of all Americans make $20 or less an hour at this point.
#9 Nine of the top ten occupations in the U.S. pay an average wage of less than $35,000 a year.
#10 The middle class in Canada now makes more money than the middle class in the United States does.
#11 According to one recent study, 40 percent of all Americans could not come up with $2000 right now even if there was a major emergency.
#12 Less than one out of every four Americans has enough money put away to cover six months of expenses if there was a job loss or major emergency.
#13 An astounding 56 percent of all Americans have subprime credit in 2014.
#14 As I wrote about the other day, there are now 49 million Americans that are dealing with food insecurity.
#15 Ten years ago, the number of women in the U.S. that had jobs outnumbered the number of women in the U.S. on food stamps by more than a 2 to 1 margin. But now the number of women in the U.S. on food stamps actually exceeds the number of women that have jobs.
#16 69 percent of the federal budget is spent either on entitlements or on welfare programs.
#17 The number of Americans receiving benefits from the federal government each month exceeds the number of full-time workers in the private sector by more than 60 million.
Taken individually, those numbers are quite remarkable.
Taken collectively, they are absolutely breathtaking.
Yes, things have been improving for the wealthy for the last several years. The stock market has soared to new record highs and real estate prices in the Hamptons have skyrocketed to unprecedented heights.
But that is not the real economy. In the real economy, the middle class is being squeezed out of existence. The quality of our jobs is declining and prices just keep rising. This reality was reflected quite well in a comment that one of my readers left on one of my recent articles...
It is getting worse each passing month. The food bank I help out, has barely squeaked by the last 3 months. Donors are having to pull back, to take care of their own families. Wages down, prices up, simple math tells you we can not hold out much longer. Things are going up so fast, you have to adopt a new way of thinking. Example I just had to put new tires on my truck. Normally I would have tried to get by to next winter. But with the way prices are moving, I decide to get them while I could still afford them. It is the same way with food. I see nothing that will stop the upward trend for quite a while. So if you have a little money, and the space, buy it while you can afford it. And never forget, there will be some people worse off than you. Help them if you can.
And the false stock bubble that the wealthy are enjoying right now will not last that much longer. It is an artificial bubble that has been pumped up by unprecedented money printing by the Federal Reserve, and like all bubbles that the Fed creates, it will eventually burst.
None of the long-term trends that are systematically destroying our economy have been addressed, and none of our major economic problems have been fixed. In fact, as I showed in this recent article, we are actually in far worse shape than we were just prior to the last major financial crisis.
Let us hope that this current bubble of false stability lasts for as long as possible.
That is what I am hoping for.
But let us not be deceived into thinking that it is permanent.
It will soon burst, and then the real pain will begin.
Link:
http://theeconomiccollapseblog.com/archives/17-facts-to-show-to-anyone-that-believes-that-the-u-s-economy-is-just-fine
"Is America still a serious country?"
Is America Still a Serious Country?
By Patrick J. Buchanan
Well, it looks like Donald Sterling will not be getting that NAACP lifetime achievement award he was set to receive at the civil rights organization’s 100th anniversary celebration in Los Angeles in May.
Allegedly, Sterling’s 30-something girlfriend, a model who goes by the name of V. Stiviano, whom Sterling’s wife of 50 years is suing, taped these remarks of the 80-year-old owner of the L.A. Clippers:
“You can sleep with [black people]. You can bring them in, you can do whatever you want. The little I ask you is not to promote it … and not to bring them to my games.
” … Don’t put him [Magic Johnson] on an Instagram for the world to have to see … and don’t bring him to my games.”
This rant of the octogenarian owner swept the canonization of Popes John XXIII and John Paul II right off of page one of the New York Times, whose headline blared:
“Amid Uproar, Clippers Silently Display Solidarity.”
The Times story told of how Clippers’ players turned their warm-up sweatshirts inside out and donned black socks and black wristbands in protest of Sterling’s remarks.
Not exactly John Lewis at Selma Bridge. And, still, the Clippers got waxed in the playoff game against the Golden State Warriors.
But the Times was not nearly done with this monstrous moral outrage, which even elicited the indignation of President Obama in Malaysia. The banner across the entire sports section of the Times read: “Vortex of Outrage Trails Clippers Owner.”
A photo of the team standing solemnly in their red warm-up suits covered half the page, and two Times’ columnists decried the horror.
Wrote Michael Powell of Sterling: He stands “exposed as a gargoyle, disgorging racial and sexual animosities so atavistic as to take the breath away.”
Finally getting his breath back, Powell went on:
“The Clippers players and coaches are no doubt mortified to have awakened in the midst of a playoff run to find that they are working for the Bull Connor of Southern California.”
But how could Sterling be the Bull Connor of California when he has a girlfriend who describes herself as black and Mexican, hired a black coach for his Clippers, Doc Rivers, and pays his players, mostly black, millions of dollars a year?
If memory serves, Bull Connor was into using fire hoses, billy clubs and German Shepherds on civil rights demonstrators in his hometown of Birmingham.
Sterling regularly sits courtside to cheer on the predominantly black team he has proudly owned for 33 years.
His rant sounds rather like an old guy mortified and humiliated at seeing his girlfriend, half his age, on TV and the Internet, making a fool of him, with black men — in public.
As for the girlfriend, or ex-girlfriend now, she allegedly taped the conversation without his knowledge, a violation of state law.
But there is apparently much more to this story than the rant, as the Times’ Billy Witz relates:
“In 2009, Sterling paid a $2.725 million settlement in a lawsuit brought by the Justice Department accusing him of systematically driving African-Americans, Latinos and families with children out of apartment buildings he owned.”
Why did the league not deal with Sterling then, for an offense far more grievous than a phone call to his girlfriend to stop making a fool of him with Magic Johnson?
Former NBA great Elgin Baylor, his former general manager, charged Sterling in a lawsuit with running a “Southern plantation-type structure” as boss of the Clippers.
And Sally Jenkins of the Washington Post reports on far nastier remarks, as she writes that Sterling said of blacks in 2002 that they “smell and aren’t clean.”
“That quote,” says Jenkins, “comes from sworn testimony in a 2002 slumlording case against Sterling for discriminating against tenants, not just blacks but also Hispanics, whom he called lazy drunks, and Koreans, whom he deemed too powerless to complain, according to statements compiled by Deadspin.com.”
“Sterling’s wormy mind,” writes Jenkins, has been “common knowledge among NBA owners and executives for years, as far back as 1983 when he allegedly called his own players the N-word during a job interview with Rollie Massimino conducted while drinking champagne.”
“There is no room for Donald Sterling in our league,” says LeBron James. But that was this weekend.
Which brings us to the unanswered questions.
How did Donald Sterling get away with behavior, in a professional sports league dominated by black players, which would get a college kid kicked out of school and scarred for life? Have they no morals clause in the NBA? How was Donald Sterling voted that lifetime achievement award by the NAACP?
The answer to all likely lies in the adage: Follow the money.
Nevertheless, when nonsense like stupid racial remarks by Nevada rancher Cliven Bundy and Clippers boss Donald Sterling can consume the nation’s conversation for a full week, it does raise a far more disturbing question:
Is America still a serious country?
Link:
http://www.lewrockwell.com/2014/04/patrick-j-buchanan/the-wacko-states-of-america/
"Like with our deteriorating educational system, our economy no longer measures up to previous standards of performance. In education, you can see the difference through comparison to a century old Jr. High School test that I believe would stymie most of today’s college graduates. Our economic deterioration can be seen in our high trade deficits, big budget deficits, high public and private debt levels and the explosion in the number of people who rely on government assistance be it in the form of welfare, food stamps, or disability."
The Debate Debate
By Peter Schiff
While there is wide agreement that the cost of college education has risen far faster than the incomes of most Americans, there is some debate as to whether the quality of the product has kept pace with the price. Not surprisingly, almost all who argue that it has (college administrators, professors, and populist politicians) are deeply invested either ideologically or financially in the system itself. More objective observers see a bureaucratic, inefficient, and hopelessly out of touch ivory tower that is bleeding the country of its savings, and more tragically, its intellectual acuity.
Nowhere is this more clearly illustrated than in the demise of collegiate debate. This once courtly rhetorical sparring ground for class presidents and lawyers-in-training is supposed to be a forum for ideas, proofs, and conclusions. And while traditional debates did not typically offer high drama, they did teach students how to produce objectively superior arguments, a skill that many types of potential employers would value. But more recently, debate has succumbed to the worst aspects of moral relativism, academic sloth and politically correct dogma, which have transformed it into an unintelligible mix of performance art and petty politics. It’s not a debate, but we pretend it is.
The 2014 National Championship of the Cross Examination Debate Association (CEDA), one of collegiate debate’s governing bodies, made headlines as the first to include two all-African-American finalist teams. The winning team, from Towson University in Maryland, was the first ever comprised solely of African-American women. The results were heralded as a triumph for minority achievement in a field traditionally dominated by white “elites.” But this success has come at a great cost: A dramatic change in the rules of the game. The championships, as well as dozens of the CEDA sanctioned debates and championships, are easily found on YouTube. I challenge anyone to watch any of those “debates” and describe the ideas and arguments that participants are supposedly addressing.
At this year’s championship, the actual debate question concerned the wisdom of restricting the war powers of the U.S. president. But instead of addressing one of the most important U.S. foreign policy questions of the past half century, the two teams focused exclusively on how the U.S. was supposedly “at war” with poor black people. Although these arguments were clearly off-subject, it seems that the topic did not matter. The “debate” came off as a mix of rap, personal invective, speed talking, soapbox harangue, and expletive-filled rants. When one contestant’s time expired, he “brilliantly” yelled “F-ck the time!” As was the case in 2013, when another African-American team took the championship, the arguments of the winners completely ignored the stated resolution, and instead used personal experience to challenge the “injustice” of the very notion of debate itself. But subjective arguments have traditionally been dismissed as poor rhetoric. “I won the lottery” is not a good argument in favor of the lottery system.
But in recent years, logic and objective analysis have come to be considered “white” concepts. In an Atlantic Monthly article (Apr. 16), Osagie Obasogie, a liberal law professor from the University of California, is quoted as saying, “Various procedures – regardless of whether we’re talking about debate formats or law – have the ability to hide the subjective experiences that shape these seemingly ‘objective’ and ‘rational’ rules. …This is the power of racial subordination: making the viewpoint of the dominant group seem like the only true reality.” In other words, debate, like much in society, was devised by white people to favor white people. This idea, which is the essence of affirmative action, may make professors and students feel good about themselves, but it simply means that minorities have license to underachieve.
Creating an alternate set of rules for people of different backgrounds causes huge problems. What would have happened to Venus and Serena Williams had tennis officials drawn up a special set of rules for them to compensate for their background? While they may have won more tournaments, they would not have been pushed to achieve their true potential and their victories would have been empty achievements. While it’s true that they faced more obstacles than privileged players from the suburbs, changing the rules to allow for their subjective experiences would have prevented their ultimate success.
That is exactly what is happening today, not just in debate tournaments, but across universities in general. Excuses are being made and rules are being bent in order to account for our personal differences, race, gender and sexual orientation in particular. This trend is producing a generation of marginally skilled, professionally unprepared graduates. The poor quality of our higher education means that we can’t compete with other nations that insist on educating their young people through “objective” and “oppressive” systems. This can also be said of our economy. Dumbed-down and subjective criteria allow us to pretend that our economy is growing even as living standards are falling, the labor force is shrinking, savings are evaporating, and opportunity is more and more elusive. Rather than admit the obvious, that we have a remedial economy, we have consistently redefined success downward with revisions to the tools we use to measure our economy, like GDP, inflation and unemployment. See a deeper analysis of this trend in my latest special report, Taxed By Debt.
Like with our deteriorating educational system, our economy no longer measures up to previous standards of performance. In education, you can see the difference through comparison to a century old Jr. High School test that I believe would stymie most of today’s college graduates. Our economic deterioration can be seen in our high trade deficits, big budget deficits, high public and private debt levels and the explosion in the number of people who rely on government assistance be it in the form of welfare, food stamps, or disability.
However, according to many economists, none of this is cause for concern as it is simply the way things work in our new “consumer-based,” “service-sector,” economy. Instead of growth through savings, capital investment, and production, we now rely on money printing, asset bubbles, leverage, and consumer credit. Inflation, which was once acknowledged as being bad, is now considered good. Persistent trade deficits, once a sign of economic distress, are now considered signs of strong domestic demand. Instead of dealing painfully with intractable problems, we have redefined our liabilities as assets and declared victory.
In the end, will awarding debate championships to undisciplined, barely comprehensible minority students really help these individuals succeed in life? No law firm or corporation will look to hire debate winners now that the competitions have lost all relevance. Similarly, dumbing down standards to whitewash our poor economic performance will only worsen our problems. Fortunately, the Supreme Court last week, with its decision to support Michigan’s campaign to end race-based selection practices at state universities, took a tiny step in dismantling this lunacy. But we must be on the lookout for much lower-profile aspects of the same confrontation. The front lines are everywhere.
Link:
http://www.lewrockwell.com/2014/04/peter-schiff/forget-college-2/
"Stated differently, the overwhelming bulk of the 600,000 so-called “bridges” in America are so little used that the are more often crossed by dogs, cows, cats and tractors than they are by passenger motorists. They are essentially no different than local playgrounds and municipal parks. They have nothing to do with interstate commerce, GDP growth or national public infrastructure."
The Madison County Bridges In Nowhere And The Perennial Myth Of Crumbling Infrastructure
By David Stockman
Whenever the beltway bandits run low on excuses to run up the national debt, they trot out florid tales of crumbling infrastructure—that is, dilapidated roads, collapsing bridges, failing water and sewer systems, inadequate rail and public transit and the rest. This is variously alleged to represent a national disgrace, an impediment to economic growth and a sensible opportunity for fiscal “stimulus”.
But most especially it presents a swell opportunity for Washington to create millions of “jobs”. And, according to the Obama Administration’s latest incarnation of this age old canard, it can be done in a fiscally responsible manner through the issuance of “green ink” bonds by a national infrastructure bank, not “red ink” bonds by the US Treasury. The implication, of course, is that borrowings incurred to repair the nation’s allegedly “collapsing” infrastructure would be a form of “self-liquidating” debt. That is, these “infrastructure” projects would eventually pay for themselves in the form of enhanced national economic growth and efficiency.
Except that the evidence for dilapidated infrastructure is just bogus beltway propaganda cynically peddled by the construction and builder lobbies. Moreover, the infrastructure that actually does qualify for self-liquidating investment is overwhelmingly local in nature—urban highways, metropolitan water and sewer systems, airports. These should be funded by user fees and levies on local taxpayers—not financed with Washington-issued bonds and pork-barreled through its wasteful labyrinth of earmarks and plunder.
Nowhere is the stark distinction between the crumbling infrastructure myth and the factual reality more evident than in the case of the so-called deficient and obsolete bridges. To hear the K-Street lobbies tell it—motorists all across America are at risk of plunging into the drink at any time owing to defective bridges.
Even Ronald Reagan fell for that one. During the long trauma of the 1981-1982 recession the Reagan Administration had stoutly resisted the temptation to implement a Keynesian style fiscal stimulus and jobs program–notwithstanding an unemployment rate that peaked in double digits. But within just a few months of the bottom, along came a Republican Secretary of Transportation, Drew Lewis, with a Presidential briefing on the alleged disrepair of the nation’s highways and bridges. The briefing was accompanied by a Cabinet Room full of easels bearing pictures of dilapidated bridges and roads and a plan to dramatically increase highway spending and the gas tax.
Not surprisingly, DOT Secretary Drew Lewis was a former governor and the top GOP fundraiser of the era. So the Cabinet Room was soon figuratively surrounded by a muscular coalition of road builders, construction machinery suppliers, asphalt and concrete vendors, governors, mayors and legislators and the AFL-CIO building trades department. And if that wasn’t enough, Lewis had also made deals to line up the highway safety and beautification lobby, bicycle enthusiasts and all the motley array of mass transit interest groups.
They were all singing from the same crumbling infrastructure playbook. As Lewis summarized, “We have highways and bridges that are falling down around our ears—that’s really the thrust of the program.”
The Gipper soon joined the crowd. “No, we are opposed to wasteful borrow and spend”, he recalled, “that’s how we got into this mess. But these projects are different. Roads and bridges are a proper responsibility of government, and they have already been paid for by the gas tax”.
By the time a pork-laden highway bill was rammed through a lame-duck session of Congress in December 1982, Reagan too had bought into the crumbling infrastructure gambit. Explaining why he signed the bill, the scourge of Big Government noted, “We have 23,000 bridges in need of replacement or rehabilitation; 40 percent of our bridges are over 40 years old.”
Still, this massive infrastructure spending bill that busted a budget already bleeding $200 billion of red ink was not to be confused with a capitulation to Keynesian fiscal stimulus. Instead, as President Reagan explained to the press when asked whether it was a tax bill, jobs bill or anti-recession stimulus, it was just an exercise in prudent governance: “There will be some employment with it, but its not a jobs bill as such. It is a necessity…..(based) on the user fee principle–those who benefit from a use should share its cost”.
Needless to say, none of that was remotely true. Twenty percent of the nickel-per-gallon gas tax increase went to mass transit, thereby breaching the “user fee” principle at the get-go, and paving the way for endless diversion of gas taxes to non-highway uses. Indeed, today an estimated 40% of highway trust fund revenues go to mass transit, bicycle paths and sundry other earmarks and diversions.
More importantly, less than one-third of the $30 billion authorized by the 1982 bill went to the Interstate Highway System—the ostensible user fee based national infrastructure investment. All the rest went to what are inherently local/regional projects—-state highways, primary and secondary roads, buses, and mass transit facilities.
And this is where the tale of Madison County bridges to nowhere comes in; and also where the principle that local users and taxpayers should fund local infrastructure could not be more strikingly illustrated.
It seems that after 32 years and tens of billions in federal funding, the nation’s bridges are still crumbling and in grave disrepair. In fact, according to DOT and the industry lobbies, there are 63,000 bridges across the nation that are “structurally deficient”, suggesting that millions of motorists are at risk of a perilous dive into the drink.
But here’s the thing. Roughly one-third or 20,000 of these purportedly hazardous bridges are located in six rural states in America’s mid-section: Iowa, Oklahoma, Missouri, Kansas, Nebraska and South Dakota. The fact that these states account for only 5.9% of the nation’s population seems more than a little incongruous but that isn’t even half the puzzle. It seems that these thinly populated country provinces have a grand total of 118,000 bridges. That is, one bridge for every 160 citizens—men, women and children included.
And the biggest bridge state among them is, well, yes, Iowa. The state has 3 million souls and nearly 25,000 bridges–one for every 125 people. So suddenly the picture is crystal clear. These are not the kind of bridges that thousands of cars and heavy duty trucks pass over each day. No, they are mainly the kind Clint Eastwood needed a local farm-wife to locate—so he could take pictures for a National Geographic spread on covered bridges.
Stated differently, the overwhelming bulk of the 600,000 so-called “bridges” in America are so little used that they are more often crossed by dogs, cows, cats and tractors than they are by passenger motorists. They are essentially no different than local playgrounds and municipal parks. They have nothing to do with interstate commerce, GDP growth or national public infrastructure.
If they are structurally “deficient” as measured by engineering standards, that is not exactly a mystery to the host village, township and county governments, which choose not to upgrade them. So if Iowa is content to live with 5,000 bridges—one in five of its 25,000 bridges—that are deemed structurally deficient by DOT, why is this a national crisis? Self-evidently, the electorate and officialdom of Iowa do not consider these bridges to be a public safety hazard, or something would have been done long ago.
The evidence for that is in another startling “fun fact” about the nation’s bridges. Compared to the 19,000 so-called “structurally deficient” bridges in the six rural states reviewed here, there are also 19,000 such deficient bridges in another group of 35 states–including Texas, Maryland, Massachusetts, Virginia, Washington, Oregon, Michigan, Arizona, Colorado, Florida, New Jersey and Wisconsin, among others. But these states have a combined population of 175 million not 19 million as in the six rural states; and more than 600 citizens per bridge, not 125 as in Iowa.
Moreover, only 7% of the bridges in these 35 states are considered to be structurally deficient rather than 21% as in Iowa. So the long and short of it is self-evident: Iowa still has a lot of one-horse bridges and Massachusetts— with 1,300 citizens per “bridge”— does not. None of this is remotely relevant to a national infrastructure crisis today—any more than it was in 1982 when even Ronald Reagan fell for “23,000 bridges in need of replacement or rehabilitation”.
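The per-capita comparison above is simple division, and anyone can reproduce it. Here is a minimal back-of-envelope sketch in Python using the rounded figures quoted in this article (the article's own numbers, not official DOT totals), so treat the output as illustrative only:

# Rough check of the bridges-per-capita comparison, using the rounded
# figures quoted in this article (illustrative only).

def citizens_per_bridge(population, bridges):
    # Average number of people "served" by each bridge.
    return population / bridges

def deficient_share(deficient, bridges):
    # Fraction of bridges rated structurally deficient.
    return deficient / bridges

# Iowa, per the article: ~3 million people, ~25,000 bridges, ~5,000 rated deficient.
print(round(citizens_per_bridge(3_000_000, 25_000)))    # -> 120 people per bridge
print(round(deficient_share(5_000, 25_000) * 100))      # -> 20 percent deficient

# Six rural states combined: ~19 million people, ~118,000 bridges, ~20,000 rated deficient.
print(round(citizens_per_bridge(19_000_000, 118_000)))  # -> 161 people per bridge
print(round(deficient_share(20_000, 118_000) * 100))    # -> 17 percent deficient

Both results land close to the "one for every 125 people" and "one bridge for every 160 citizens" figures cited above, which is all the comparison requires.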
Yes, the few thousand bridges actually used heavily in commerce and passenger transportation in America do fall into disrepair and need periodic reinvestment. But the proof that even this is an overwhelmingly state and local problem is evident in another list maintained by the DOT.
That list would be a rank ordering called “The Most Travelled Structurally Deficient Bridges, 2013”. These are the opposite of the covered bridges of Madison County, but even here there is a cautionary tale. It seems that of the 100 most heavily traveled structurally deficient bridges in the US, 80% are in California!
Moreover, they are overwhelmingly state highway and municipal road and street bridges located in Los Angeles, Orange County and the Inland Empire. Stated differently, Governor Moonbeam has not miraculously solved California’s endemic fiscal crisis; he’s just neglected the local infrastructure. There is no obvious reason why taxpayers in Indiana or North Carolina need to be fixing California’s bridges—so that it can continue to finance its outrageously costly public employee pension system.
And so it goes with the rest of the so-called infrastructure slate. There is almost nothing there that is truly national in scope and little that is in a state of crumbling and crisis.
Indeed, the one national asset—the Interstate Highway System—is generally in such good shape that most of the “shovel ready” projects on it during the Obama stimulus turned out to be resurfacing projects that were not yet needed and would have been done in the ordinary course anyway, and the construction of new over-passes for lightly traveled country roads that have happily been dead-ends for decades.
One thing is clear. There is no case for adding to our staggering $17 trillion national debt in order to replace the bridges of Madison County; or to fix state and local highways or build white-elephant high-speed rail systems; or to relieve air travelers of paying user fees to upgrade local airports, or local taxpayers of their obligation to pay fees and taxes to maintain their water and sewer systems.
At the end of the day, the ballyhooed national infrastructure crisis is a beltway racket of the first order. It has been for decades.
Here is the bridge data in all its splendid detail!
Read the rest here:
http://www.lewrockwell.com/2014/04/david-stockman/crumbling-infrastructure/
Tuesday, April 29, 2014
I agree...
Obama the Golfer
Laurence M. Vance
Earlier this year it was reported that President Obama had played his 160th round of golf. He has played golf an average of once every 11 days in office. Conservatives are, of course, upset. I say good for him. Better he is occupied with golf than with ordering drone strikes. Too bad Bush didn’t play more golf.
Link:
http://www.lewrockwell.com/lrc-blog/obama-the-golfer/
And the beat goes on...
CDC: One in 13 Children Taking Psych Meds
Written by Raven Clabough
Data from the Centers for Disease Control reveal that there continues to be a significant increase in the number of school-age children on psychiatric medications to treat emotional or behavioral problems. A new health study shows that 7.5 percent of children between the ages of 6 and 17 are on psych meds, based on data collected in interviews conducted between 2011 and 2012 with the parents of over 17,000 children.
"Over the past two decades, the use of medication to treat mental health problems has increased substantially among all school-aged children and in most subgroups of children," the report's authors explained.
Unfortunately, the survey did not identify which diagnoses were being treated by the medications, but estimates indicate that a majority of the drugs are to treat ADHD symptoms, a point that critics are likely to seize upon. As noted by the UPI, "The study may lend credence to critics who say America's children are over-diagnosed with ADHD — and subsequently over-prescribed and over-medicated."
According to the American Psychiatric Association, five percent of American children have ADHD, but studies reveal more than 11 percent of American children are diagnosed with the condition.
What may be more alarming is that there is increasing evidence that ADHD may not be the epidemic that some are claiming, and in fact, may not even be an actual condition.
Dr. Richard Saul, who has been practicing behavioral neurology for 50 years and is the author of the new book ADHD Does Not Exist, writes in a March 14 Time piece:
The fifth edition of the DSM [Diagnostic and Statistical Manual of Mental Disorders] only requires one to exhibit five of 18 possible symptoms to qualify for an ADHD diagnosis. If you haven’t seen the list, look it up. It will probably bother you. How many of us can claim that we have difficulty with organization or a tendency to lose things; that we are frequently forgetful or distracted or fail to pay close attention to details? Under these subjective criteria, the entire U.S. population could potentially qualify.
Saul's analysis confirms what critics have been saying regarding the growth in the rate of mental illness issues: that it may in fact be the result of expanded medical terms and definitions.
Slate.com warned of such a thing last April:
Beware the DSM-5, the soon-to-be-released fifth edition of the "psychiatric bible," the Diagnostic and Statistical Manual. The odds will probably be greater than 50 percent, according to the new manual, that you’ll have a mental disorder in your lifetime.
Although fewer than 6 percent of American adults will have a severe mental illness in a given year, according to a 2005 study, many more — more than a quarter each year — will have some diagnosable mental disorder. That’s a lot of people. Almost 50 percent of Americans (46.4 percent to be exact) will have a diagnosable mental illness in their lifetimes, based on the previous edition, the DSM-IV. And the new manual will likely make it even "easier" to get a diagnosis.
The expanded definitions have resulted in significant increases in diagnoses of mental disorders, particularly ADHD. Dr. Saul writes, "The New York Times reported that from 2008 to 2012 the number of adults taking medications for ADHD increased by 53% and that among young American adults, it nearly doubled."
But Saul contends that the diagnosis of ADHD overlooks the real problems. As Kyle Smith at the New York Post reports:
One girl he [Saul] treated, it turned out, was being disruptive in class because she couldn’t see the blackboard. Correct diagnois [sic]: myopia. She needed glasses, not drugs.
A 36-year-old man who complained about his addiction to online games and guessed he had ADHD, it turned out, was drinking too much coffee and sleeping only four to five hours a night. Correct diagnosis: sleep deprivation. He needed blackout shades, a white-noise machine and a program that shut all his devices off at midnight.
Smith concluded, "One by one, nearly all of Saul’s patients turned out to have some disease other than ADHD, such as Tourette’s, OCD, fragile X syndrome (a genetic mutation linked to mental retardation), autism, fetal alcohol syndrome, learning disabilities or such familiar conditions as substance abuse, poor hearing or even giftedness."
Some believe that the increase in these prescriptions results from parents looking to find an easy solution to their children's behavioral problems.
"There’s a societal trend to look for the quick fix, the magic bullet that will correct disruptive behaviors," said David Rubin, M.D., associate professor of pediatrics at the Perelman School of Medicine at the University of Pennsylvania in Philadelphia. "But for those looking for a quick solution to escalating behaviors at home, the hard truth is there is unlikely to be a quick fix."
Psychologist and parenting columnist John Rosemond echoes this sentiment, asserting that childhood misbehavior resulting from lack of discipline is incorrectly diagnosed as ADHD. As such, it is turning a discipline problem into a psychological problem.
Consumer Reports indicates, "Doctors are prescribing antipsychotics even though there’s minimal evidence that the drugs help kids for approved uses, much less the unapproved ones, such as behavioral problems. And to make matters worse, the little research there is suggests the drugs can cause troubling side effects, including weight gain, high cholesterol, and an increased risk of type-2 diabetes."
Consumer Reports also notes that the increase in the prescribing of antipsychotics can be attributed to several other factors, including aggressive drug marketing that overhypes the benefits of the pharmaceuticals and downplays their risks:
Antipsychotics have become huge moneymakers for the drug industry. In 2003, annual U.S. sales of the drugs were estimated at $2.8 billion; by 2011, that number had risen to $18.2 billion. That huge growth was driven in part by one company — Janssen Pharmaceuticals — and its aggressive promotion of off-label uses in children and elderly patients, relying on marketing tactics that according to the federal government, crossed legal and ethical lines.
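Taken at face value, those sales figures imply growth of roughly 26 percent a year; a minimal sketch of that arithmetic (only the two numbers from the Consumer Reports excerpt go in, the compounding is mine):

    # Implied compound annual growth of U.S. antipsychotic sales, 2003 to 2011.
    sales_2003 = 2.8e9    # dollars, per the excerpt above
    sales_2011 = 18.2e9
    cagr = (sales_2011 / sales_2003) ** (1 / (2011 - 2003)) - 1
    print(f"{cagr:.1%}")  # about 26% per year, a roughly 6.5-fold rise overall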
And the NCHS study reports that children from poorer families are more likely to be medicated, a point observed by Dr. Rubin last year.
"Use is really high among kids in the Medicaid system where decent non-drug services may be difficult to find," says Rubin, who also points out that even kids with private insurance often don’t have coverage for psychiatric care or counseling.
Research shows that doctors are prescribing the drugs for "off-label" uses such as for ADHD and other diagnosed behavioral problems, which involve a significantly higher percentage of children than schizophrenia and bipolar disorder.
"What started out as a treatment with some level of evidence for a small sub-group of youth with significant development disabilities ... has been extended to cognitively normal kids without any strong evidence," Rubin said.
Critics observe that the side effects of anti-psychotic drugs could worsen — or in some cases actually cause — symptoms of mental illness.
According to the Citizens Commission on Human Rights International (CCHRINT), there is abundant evidence proving a connection between psychotropic medications and violent crimes, and government officials are well aware of the connection: “Between 2004 and 2011, there have been over 11,000 reports to the U.S. FDA’s MedWatch system of psychiatric drug side effects related to violence,” says CCHRINT, including 300 homicides.
Link:
http://www.thenewamerican.com/usnews/item/18145-cdc-one-in-13-children-taking-psych-meds
"...trying to use government and taxation to wipe out inequality never works and will only make society poorer. This is a lesson that Barack Obama, the Democrats, the Republicans, the mainstream media and the Pope all desperately need to learn."
The Pope Is Completely Wrong About Capitalism And Inequality
By Michael Snyder
On Monday, the following message was posted on the Pope's official Twitter account: "Inequality is the root of social evil." This follows on the heels of several other extremely harsh statements that he has made about capitalism over the past year. The Pope appears to believe that inequality is one of the greatest evils that humanity is facing. So if we redistributed all money and all property and made sure that everyone had an equal amount, would that wipe out social evil? Of course not. Such a notion is absolutely absurd. Being the Pope, he should know that the evil that we see all around us is not the result of the distribution of wealth. Rather, it is the result of humanity's deep rebellion against God. Yes, the fact that the wealth of the planet is being increasingly funneled to a very small minority at the top of the pyramid is a major problem. This is something that I have written about repeatedly. But the answer is not to make sure that everyone has the exact same amount of money and property. In the end, that would only turn us into North Korea.
In case you missed it, here is the tweet by the Pope that is causing such an uproar...
https://twitter.com/Pontifex/status/460697074585980928
By itself, that statement could perhaps be "interpreted" a number of different ways. But this follows other statements by the Pope that make it exceedingly clear what he is talking about. Here is one example...
Just as the commandment “Thou shalt not kill” sets a clear limit in order to safeguard the value of human life, today we also have to say “Thou shalt not” to an economy of exclusion and inequality. Such an economy kills. How can it be that it is not a news item when an elderly homeless person dies of exposure, but it is news when the stock market loses two points? This is a case of exclusion. Can we continue to stand by when food is thrown away while people are starving? This is a case of inequality.
Yes, the Pope is correct to highlight the plight of the homeless and the needy. Even in "wealthy America", we have an epidemic of hunger. This is something that I wrote about yesterday.
And yes, the Pope is correct to point out society's obsession with the stock market. Personally, I have been relentless in criticizing the big Wall Street banks.
But the solution is not to take everything away from everybody and put it into a giant pile and redistribute it equally.
History has shown us what happens when a society adopts an extreme form of socialism or communism.
The incentive to work is destroyed, the incentive to create new ideas and new businesses is destroyed, and living standards for everyone go down.
Please don't think that I am defending our current system. What we have in the United States today is not the kind of pure capitalism that our founders intended. Instead, it is a form of collectivism where nearly all of the economic power is now in the hands of giant collectivist institutions. That includes public collectivist institutions (the government) and private collectivist institutions (large corporations). In this type of economic environment, it should not be a surprise that government dependence is at an all-time high, the number of Americans that are self-employed is at an all-time low and millions of small businesses are being regulated out of existence.
Collectivism, socialism and communism are all close cousins. People are promised that such systems will result in greater "equality", but it never seems to actually work out that way. Instead, the small elite that hold all the power usually end up enjoying the vast majority of the benefits.
And without a doubt, as the power of the government and the power of the corporations has increased, inequality has been rising. Just check out the following chart from a new book by 42-year-old French economist Thomas Piketty entitled "Capital In The Twenty-First Century"...
As I write this, Piketty's book is the number one seller on Amazon. That is pretty remarkable for an economics treatise. But Piketty fails to realize what actually caused U.S. income inequality to start skyrocketing in 1971. As Brian Domitrovic recently detailed, that was the year when the U.S. completely went off the gold standard and the Federal Reserve started running wild...
The big switch to the foundation of the American financial structure at the advent of this period was the U.S. decision in 1971 to go off the gold standard. Before that time, it was basically clear that outside of wartime (when gold-standard conventions were often suspended), you could basically count on the dollar holding its value against major things like the consumer price level, foreign currencies, and commodities such as gold itself.
After 1971, in contrast, it became basically clear that you could count on no such thing. The CPI might go up 125% in one decade (as it did 1971-1981), the dollar could permanently lose 66% against major currencies (as it did against the yen in this period), and commodities could shoot up ten-to twenty-five fold (as was the case with oil and gold).
Therefore a new day in financial planning also arrived. Suddenly the importance of simply saving money diminished. Money that was saved also had to be hedged. If you simply saved money after 1971, you stood to get killed as the dollar lost value against things it was supposed to be able to procure in the future.
This is where the financial services industry began its long march upward in the share of U.S. economic output it gobbled up. People who had significant money—the rich—threw their money into the products offered by the financial sector, in that the worst thing to happen to a fortune diligently built up over the years would be to see it frittered away on account of currency depreciation.
So much has gone wrong since 1971. Our national debt has gotten more than 40 times larger, our economic infrastructure has been absolutely gutted and the value of the U.S. dollar has declined by well over 80 percent.
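To put those figures in annual terms, a small sketch (the 125 percent CPI rise and the 80 percent dollar decline are taken from the passages above; the compounding is mine):

    # Annualizing the 1971-1981 CPI rise, and restating the dollar's decline.
    annual_inflation = 2.25 ** (1 / 10) - 1  # CPI up 125% over the decade
    print(f"{annual_inflation:.1%}")         # roughly 8.4% per year
    price_multiple = 1 / (1 - 0.80)          # a dollar worth 80% less...
    print(price_multiple)                    # ...means prices roughly 5x higher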
Once again, we need to go back to a system that much more closely resembles what our founders intended.
Did you know that the greatest period of economic growth in U.S. history was when there was no income tax, no IRS and no Federal Reserve?
We could have such a system again.
But the solutions being proposed by the mainstream media, our politicians and even the Pope involve even more centralization of economic power.
If we follow this path to the end, we will ultimately become like North Korea.
It is hard to describe the crushing poverty that exists in that hellhole of a country. In North Korea, there is so little electricity that the country appears almost totally dark from space at night. Just check out this picture taken by NASA...
North Korea may have more "equality" than we do, but in that country "a ballpoint pen is considered a luxury item". Here is much more on what life is like for ordinary people inside North Korea from the New York Post...
Jobs often come without salaries. Those who do get a paycheck, earn, on average, between $1,000-2,000 a year. Food and clothing are rationed by the government.
Most North Koreans have access to that one TV station and one newspaper, both state-run; they are told that their country is the only functioning and prosperous nation on Earth and that outside rages an apocalypse. Only elites are allowed cellphones, but they can just make calls or text — there is no Internet.
Would you like to live in such a society?
When you take away the incentive to work and the incentive to create, you end up with a much poorer society. Without outside help, much of North Korea would have already starved to death by now...
“The majority of North Koreans believe completely in the regime,” says Barbara Demick, a Seoul-based journalist and author of “Nothing to Envy: Ordinary Lives in North Korea.”
“They are barely surviving,” she says. “Only the rich can afford to eat rice. They’re in a chronic state of food shortage.”
The average citizen eats twice a day — a manageable state of affairs for citizens who lived through the great famine of the ’90s, which reduced millions of people to eating tree bark and plucking undigested kernels of corn from animal excrement.
Yes, something needs to be done about the rising level of income inequality in our country. The middle class is being systematically destroyed and most of our politicians do not seem to care. Some big steps in that direction would be going back to a much purer form of capitalism, shutting down the Federal Reserve, changing laws to shift power much more in the direction of individuals and small businesses, and ending the practice of shipping millions of our good paying jobs to communist nations such as China.
We also need a massive shift in our culture. We need to shift away from a culture of greed and selfishness to a culture of love, compassion and generosity. Those that have been blessed have a responsibility to be a blessing. That is something that we have largely forgotten.
But trying to use government and taxation to wipe out inequality never works and will only make society poorer. This is a lesson that Barack Obama, the Democrats, the Republicans, the mainstream media and the Pope all desperately need to learn.
See the whole article here: http://theeconomiccollapseblog.com/archives/the-pope-is-completely-wrong-about-capitalism-and-inequality
"Anti-anxiety drugs such as Valium and Xanax create dissociation, increased aggression, hallucinations and acute amnesia. They are extremely addictive, to boot."
Connection between psychiatry and military suicides revealed
by: J. D. Heyes
The U.S. military has been experiencing its highest-ever suicide rates for the past several years, and a new documentary film lays out shocking evidence that the Pentagon's reliance on psychotropic antidepressants is feeding the epidemic.
The film, The Hidden Enemy: Inside Psychiatry's Covert Agenda, produced by the Citizens Commission on Human Rights, details psychiatry's rise in the military, from World Wars I and II to its prominence today, laying bare a sinister conspiracy to use men and women in uniform as nothing more than guinea pigs.
As noted in the film, the dramatic rise in suicides can be directly linked to a similar dramatic rise in military personnel being prescribed mind-altering psychotropic drugs for "mental health conditions" that have never been scientifically validated. But why? What makes today's fighters so much less capable than those of earlier periods?
Is it "combat stress"? That's the reason being given by the military psychiatric community. But the very visible effects of combat stress have been chronicled by warriors, writers and other observers going back to ancient times.
The name has changed, but the treatment remains the same - and dangerous
In more recent conflicts, the condition has been called "soldier's heart," "battle fatigue" and "shell shock." Whatever its name, militaries for centuries have acknowledged that sometimes the horror of combat can get to be too much for soldiers to bear.
However, in 1980 the label changed again; psychiatrists began calling it "Post Traumatic Stress Disorder," or PTSD, and later they claimed, without supporting evidence, that it was some kind of brain dysfunction.
And like other psychiatric disorders, PTSD was never found through scientific testing. It was lobbied for by military psychiatrists and literally voted into psychiatry's Diagnostic and Statistical Manual.
Today it is military psychiatry's most popular diagnosis, and it is rapidly spreading into the civilian world. Right now, more than one-third (37 percent) of war veterans are being "treated" for PTSD. Once they are diagnosed, 80 percent of them are given a psychiatric medication. Of those who are drugged, 89 percent are given antidepressants and 34 percent are given powerful antipsychotics.
As a result, spending on drugs to treat PTSD and brain injuries has rocketed up some 400 percent since 2003.
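Chaining the percentages above gives a sense of scale; a rough sketch (the individual rates are from the article, the multiplication is mine):

    # Approximate share of all war veterans ending up on each drug class,
    # using the quoted rates: 37% "treated" for PTSD, 80% of those medicated,
    # then 89% antidepressants and 34% antipsychotics among the medicated.
    ptsd_treated = 0.37
    medicated = 0.80
    print(f"{ptsd_treated * medicated * 0.89:.0%}")  # ~26% of veterans on antidepressants
    print(f"{ptsd_treated * medicated * 0.34:.0%}")  # ~10% on antipsychotics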
The thing is, if most psychiatrists ever bothered to actually do a physical exam, there is a good chance that they would find legitimate physical damage that could be verified with a brain scan. This condition is called "Traumatic Brain Injury," or TBI, and it is estimated that 320,000 soldiers suffer from this condition, which is most commonly caused by concussive blasts from roadside bombs known as IEDs, or improvised explosive devices. TBI can cause an inability to focus, short-term memory loss and trouble with rage or anger.
What's more, while this very real problem gets worse, it is often masked with mind-altering medications, which can make things much, much worse.
Deadly drugs that are addictive
Most people -- even those who have never served -- understand that life as a soldier is demanding. They are trained to be alert, decisive and focused, and are expected to be in top physical and mental shape. These are necessities.
Then why are so many being prescribed psychiatric drugs, when psychiatrists know that they make soldiers ineffective?
The problem is that nobody really knows what kind of physical and emotional side effects psychiatric drugs are going to cause. But those who make them and those who prescribe them know of their inherent risks:
-- Antidepressants commonly prescribed for PTSD can cause blurred vision, low blood pressure, internal bleeding, weight gain, seizures, impaired cardiac function, sexual impotence and psychosis.
-- Antipsychotics frequently given for sleeping problems are known to cause organ damage, body twitching, severe weight gain and sudden cardiac death.
-- Anti-anxiety drugs such as Valium and Xanax create dissociation, increased aggression, hallucinations and acute amnesia. They are extremely addictive, to boot.
-- Stimulants such as Adderall and Ritalin are also very addictive and can result in manic behavior, heart palpitations, brain shrinkage, insomnia, cardiac arrhythmia and even sudden death.
Psychiatrists will tell their patients that the benefits of these drugs outweigh any risks, but that is simply not true. There have been large studies proving that psychiatric drugs like antidepressants and antipsychotics work no better than a dummy sugar pill.
And some of them -- especially when taken in combination with each other -- can be deadly.
Watch the shocking documentary on the connection between psychiatry and military suicides today.
Learn more: http://www.naturalnews.com/044909_psychiatry_military_suicides_psychotropic_drugs.html#ixzz30GzyXZMo
"...the number of Americans getting money or benefits from the federal government each month exceeds the number of full-time workers in the private sector by more than 60 million."
The Real Unemployment Rate: In 20% Of American Families, EVERYONE Is Unemployed
By Michael Snyder
According to shocking new numbers that were just released by the Bureau of Labor Statistics, 20 percent of American families do not have a single person that is working. So when someone tries to tell you that the unemployment rate in the United States is about 7 percent, you should just laugh. One-fifth of the families in the entire country do not have a single member with a job. That is absolutely astonishing. How can a family survive if nobody is making any money? Well, the answer to that question is actually quite easy. There is a reason why government dependence has reached epidemic levels in the United States. Without enough jobs, tens of millions of additional Americans have been forced to reach out to the government for help. At this point, if you can believe it, the number of Americans getting money or benefits from the federal government each month exceeds the number of full-time workers in the private sector by more than 60 million.
When I was growing up, it seemed like anyone that was willing to work hard could find a good paying job. But now that has all changed. At this point, 20 percent of all the families in the entire country do not have a single member that has a job. That includes fathers, mothers and children. The following is how CNSNews.com broke down the numbers…
A family, as defined by the BLS, is a group of two or more people who live together and who are related by birth, adoption or marriage. In 2013, there were 80,445,000 families in the United States and in 16,127,000—or 20 percent–no one had a job.
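The 20 percent figure follows directly from those two counts; a one-line check:

    # BLS family counts quoted above.
    total_families = 80_445_000
    no_one_working = 16_127_000
    print(f"{no_one_working / total_families:.1%}")  # about 20.0% of families with no one working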
To be honest, these really are Great Depression-type numbers. But over the years “unemployment” has been redefined so many times that it doesn’t mean the same thing that it once did. The government tells us that the official unemployment rate is about 7 percent, but that number is almost meaningless at this point.
A number that I find much more useful is the employment-population ratio. According to the employment-population ratio, the percentage of working age Americans that actually have a job has been below 59 percent for more than four years in a row…
Read the rest here:
http://endoftheamericandream.com/archives/the-real-unemployment-rate-in-20-of-american-families-everyone-is-unemployed
"...children who received antidepressants had twice the rate of suicidal ideation and behavior than children who were given a placebo."
High doses of antidepressants appear to increase risk of self-harm in children and young adults
Medical Press
Children and young adults who start antidepressant therapy at high doses, rather than the "modal" [average or typical] prescribed doses, appear to be at greater risk for suicidal behavior during the first 90 days of treatment.
A previous meta-analysis by the U.S. Food and Drug Administration (FDA) of antidepressant trials suggested that children who received antidepressants had twice the rate of suicidal ideation and behavior as children who were given a placebo. The authors of the current study sought to examine suicidal behavior and antidepressant dose, and whether risk depended on a patient's age.
The study used data from 162,625 people (between the ages of 10 and 64 years) with depression who started antidepressant treatment with a selective serotonin reuptake inhibitor at modal doses (the most commonly prescribed) or at higher-than-modal doses from 1998 through 2010.
The rate of suicidal behavior (deliberate self-harm, or DSH) among children and young adults (24 years or younger) who started antidepressant therapy at high doses was about twice as high as in a matched group of patients who received generally prescribed doses. The authors suggest this corresponds to about one additional event of DSH for every 150 patients treated with high-dose therapy. For adults 25 to 64 years old, the difference in risk for suicidal behavior was null. The study does not address why higher doses might lead to higher suicide risk.
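Working backward from that figure: if high-dose initiation roughly doubles the risk and that doubling amounts to one extra event per 150 patients, the implied baseline 90-day risk is about two-thirds of one percent. A minimal sketch of that back-of-the-envelope (the doubling and the 1-in-150 figure are from the article; the rest is inference):

    # Implied baseline risk from a risk ratio of ~2 and one extra DSH event per 150 patients.
    risk_ratio = 2.0
    risk_difference = 1 / 150
    baseline = risk_difference / (risk_ratio - 1)
    print(f"{baseline:.2%}")               # ~0.67% over the first 90 days (modal dose)
    print(f"{baseline * risk_ratio:.2%}")  # ~1.33% for high-dose starters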
"Considered in light of recent meta-analyses concluding that the efficacy of antidepressant therapy for youth seems to be modest, and separate evidence that dose is generally unrelated to the therapeutic efficacy of antidepressants, our findings offer clinicians an additional incentive to avoid initiating pharmacotherapy at high-therapeutic doses and to monitor all patients starting antidepressants, especially youth, for several months and regardless of history of DSH." Matthew Miller, M.D., Sc.D., of the Harvard School of Public Health, Boston, and colleagues wrote in their JAMA Internal Medicine article.
In a related commentary, David A. Brent, M.D., of the University of Pittsburgh, and Robert Gibbons, Ph.D., of the University of Chicago, write: "In summary Miller et al are to be commended on a thoughtful and careful analysis of the effects of initiating antidepressants at higher than modal doses."
"Their findings suggest that higher than modal initial dosing leads to an increased risk for DSH and adds further support to current clinical recommendations to begin treatment with lower antidepressant doses. While initiation at higher than modal doses of antidepressants may be deleterious, this study does not address the effect of dose escalation," they continue.
"Moreover, while definitive studies on the impact of dose escalation in the face of nonresponse remain to be done, there are promising studies that suggest in certain subgroups, dose escalation can be of benefit. Finally it should be noted that in this study, there was no pre-exposure to post-exposure increase in suicidal behavior after the initiation of antidepressants in youth treated at the modal dosage," they conclude.
Link:
http://medicalxpress.com/news/2014-04-high-doses-antidepressants-self-harm-children.html
Short and sweet...
You Mean the Anti-Peace Process, Don’t You?
Thomas DiLorenzo
Every time there is an election in the Middle East the sock-puppets-for-the-state (a.k.a. “mainstream media”) bloviate endlessly about what this might mean for “the peace process.” Huh? What? What peace process? For my entire life, all U.S. government meddling in the Middle East has done is to generate more and more conflict. It is an anti-peace process and always has been. If peace ever broke out in the Middle East, too many wealthy and powerful people in the U.S. and the Middle East would lose most or all of their political power and much of their wealth.
Link:
http://www.lewrockwell.com/lrc-blog/you-mean-the-anti-peace-process-dont-you/
"Like the terminal adolescents they are, the leaders of both parties in the United States and their counterparts in NATO and the EU cannot bring themselves to admit the clear and simple fact that they are responsible for the festering problem in Ukraine..."
Obama, McCain, and the NATO/EU Gang: Better War than Saying: “It’s our Fault”?
By Michael Scheuer
Once again Americans are watching their government involve itself in an issue in which the United States has nothing at stake economically and no genuine national security interest at risk. Ukraine is a place that is worth neither a single American dollar nor more than a brief scan of the headlines by U.S. citizens. And yet Obama and his fellow European interveners and democracy mongers are conducting themselves in a bellicose manner that could lead to some kind of military conflict in Eastern Europe. Indeed, they already are conducting warfare against Russia via economic sanctions, a punitive exercise they promise to make more severe in the next few weeks.
And for what? When all is said and done, Obama and Team Democracy appear to prefer a war to publicly admitting that it was their democracy crusading last winter in Kiev that brought on this worrying and sharpening confrontation. Into an increasingly bitter political battle between the Kiev regime and its domestic opponents, the EU as an organization and individual European governments sent a steady flow of diplomats, officials, and money to help the Ukrainian opposition prevail over the Kiev regime. This foreign intervention in a purely internal domestic dispute was clearly designed to overthrow the legitimate Ukrainian government. It is the kind of imperialist exercise that the UN was created to condemn and stop, but that organization’s recent history shows that it now exists solely to support unjustified — and usually unjustifiable — U.S. and Western political and military interventions.
We will never know how the internal Ukrainian dispute would have worked itself out if the Ukrainians had been allowed an exercise in self-determination, but what we do know is that the EU’s arrogant intervention in the country’s internal affairs tipped the scales in the opposition’s favor and led to the Kiev regime’s collapse. And we know that Obama, Kerry, and Biden steered clear of the problem until they saw that the EU’s intervention might succeed. Faced with that reality, these U.S. leaders put their best interventionist foot forward and joined the Europeans to wreck both Ukraine and European stability in the name of a democracy that will never see the light of day in Kiev. Washington, NATO, and Brussels are now well on their way to creating in Ukraine the same kind of democratic paradise they previously delivered in Egypt, Libya, and South Sudan.
They are also striding cluelessly along a road that could lead to a war in Eastern Europe. Why? Simple. The democracy mongers operate on the assumption that only the United States and Europe have legitimate national interests. Actions taken by non-Western states to defend what they perceive to be life-and-death national interests are labeled by Washington and NATO as illegitimate, aggressive, war-causing operations. But hold on for a moment. Was it Russia that intentionally fomented revolution in Ukraine? No, there is no evidence of that. Was it Moscow that publicly threatened the Ukrainian opposition with force and trials for war crimes? No, it was the West and the UN who treated the legitimate Ukrainian regime in that manner. So it was, in fact, Washington, NATO, and the EU who took a solely internal Ukrainian conflict and, by intervening in favor of anti-Russian Ukrainians, made it into a showdown between the West and Russia.
About Mr. Putin. One must say that he is not a particularly likeable man, and he is, after all, the legatee of a political system that killed and starved-to-death 60-plus million people. (NB: Odd, is it not, that the West spends years and billions of dollars tracking down a handful of two-bit Serbian and African murderers, but never utters a word about Russian and Chinese genocide-merchants who have killed far more than 100 million people?) Anyway, what has Putin done that makes him and Russia the sole bad guys in this sorry Ukrainian drama?
Well, Mr. Putin had the gall to see that the aggressive but always effete democracy mongers were mindlessly intent on a regime-change operation in Kiev that would put an anti-Russian regime in power, increase animosities between the country’s ethnic Ukrainians and ethnic Russians, and destabilize Ukraine and perhaps have a knock-on destabilizing impact along much of Russia’s western border. Faced with this prospect, Putin unleashed his armored columns and took Kiev and all Ukraine, right? No. Faced with what the West was doing to make Ukraine an anti-Russian bastion and promote civil war in the country, Putin simply did what genuine Russian national interests required: he took what always has been and always must be Russia’s, the Crimea and its naval bases. Any Westerner who claims he was surprised by this action — or the cause of it — is either a liar or ignorant of Russian history. Given the state of Western education, the latter is at least as likely as the former.
So Putin takes Crimea and it votes to join Russia. End of crisis? No. Even though it is obvious that U.S.-NATO-EU intervention caused the crisis in the first place, the democracy mongers sanction Russia for protecting its national interests and then pick up the pace of intervention by pumping funds into the anti-Russian regime in Kiev, deploying U.S. military forces in NATO’s Eastern European members, and, with Obama trying to prove he is not the terminal adolescent that Putin knows he is, waging war against Russia via sanctions. And, of course, there on the sidelines are America’s Neoconservatives, urging the West to threaten the use of military force against Russia and at least heavily arm the illegitimate government now operating in Kiev.
Like the terminal adolescents they are, the leaders of both parties in the United States and their counterparts in NATO and the EU cannot bring themselves to admit the clear and simple fact that they are responsible for the festering problem in Ukraine. They have encountered in Putin a man who is unsavory and no hero, but one who is a thorough-going nationalist who will not roll over, play dead, and abandon his country’s security interests because the intervention-addicted Western democracy mongers demand he do so. Western pride, historical ignorance, and hubris make admitting a mistake impossible, so we continue meandering toward war.
A final word on sanctions. Western interventionists ought to recall that (a) economic sanctions are attacks on the targeted nation that amount to acts of war, and (b) economic sanctions that savage an already fragile economy — like Russia’s — can make the attacked state opt for war as a last resort. Americans still debate whether FDR’s sanctions against Japan were an attempt to change Tokyo’s foreign policy or to force Japan to start a Pacific war that FDR wanted to fight but that the American people overwhelmingly opposed. Which side of that debate is accurate is irrelevant here, and perhaps it is unknowable. What is irrefutable fact, however, is that FDR’s sanctions forced Imperial Japan to decide between war and the withering away of its economic and military power and the eventual termination of its status as a Great Power. Even the West’s ill-educated leaders must know the decision Imperial Japan took as the result of FDR’s sanctions.
Link:
http://www.lewrockwell.com/2014/04/michael-scheuer/the-usnatoeu-gang/