
[Serial][UWDFF Alcubierre] Part 49

Beginning | Previous
Joan opened a link to Ambassador Amahle Mandela. Soon after, the ambassador's face filled a portion of the Admiral's Bridge. She had large, luminous brown eyes that seemed to swallow the upper portion of her face, complementing her umber tone. Amahle smiled broadly, as she always did, once the comm link was connected.
"Admiral Orléans, I assume we are approaching the departure time?"
Joan nodded, "The Zix vessel will project a wormhole to Halcyon shortly. We have made what preparations we can, but it will be a highly fluid environment."
Amahle's smile did not diminish, the pearly whites still shining in full force. "I am familiar with dynamic situations, Admiral, as you well know. I understand the parameters of this mission, and will abide by them so long as you do the same."
Joan's lips pressed together as she regarded the ambassador. Joan had had limited interactions with Amahle prior to her boarding the Oppenheimer. Amahle was a relative newcomer to the highest echelons of political power within the United World, but her ascent had been rapid. She hailed from a prominent political family that had exerted considerable influence over the generations that had led the African continent to the position of power it now occupied. Well-sourced references had called her bold and decisive. All things considered, Joan understood why Damian had chosen her, though she would have preferred a diplomat she had more personal experience with. Still, unknown and competent was preferred to known and incompetent.
Joan dipped her chin, offering her agreement. "A diplomatic outcome is the preferred outcome, Ambassador. There's no benefit to antagonizing a foe we do not understand."
"Not a foe, Admiral. We must not draw lines that place us on one side and them on the other. They have suffered injury at our hands, no matter how unintentional, and we must accept our responsibility in that. We must hope that we are given the opportunity to provide context to the unlikely chain of events that has brought us to this point. We are both the victim of cosmic circumstance. There is no need for further hostility."
Joan leaned forward in her chair slightly, "The priority, Ambassador, is the return of Admiral Kai Levinson. I will not stand in the way of peace, but any outcome that does not contemplate the return of a senior member of our military leadership is unacceptable."
Amahle shrugged, "So it is. The priority is clear in my mind, but I do not view the goals of securing peace and the return of the Admiral as mutually exclusive."
Joan offered a low chuckle. "Just probably exclusive."
"I disagree, but time shall be the arbiter of the matter."
"So long as you understand that, if the opportunity to secure Admiral Levinson presents itself, I'll avail myself of that opportunity, we should have no problems."
"That seems an unlikely outcome. The Admiral was ensconced in a shielded holding cell when the Alcubierre departed. The past few days are unlikely to have changed that outcome."
A barking laugh came out of Joan, rising up from deep within her.
For the first time, Amahle's smile faltered.
-----------
Left. Right. Straight. Left. Left.
Kai followed the directions without thinking about them, following an intuitive sense of direction that the Overseer fed to him. This portion of Halcyon appeared to be a never-ending series of corridors, all of which looked the same. The only thing that did seem to change was the inhabitants. Had he been less preoccupied with the task at hand, Kai might have spared a second glance for the odd creatures that popped into existence during his mad dash. As it stood, they were just a part of the scenery, becoming relevant only if Neeria indicated they might pose a threat. So far, Kai had been fortunate, with few obstacles popping up to impede his progress.
He careened around a corner, the odd, weightless orb still tucked in the crook of his left arm. He bounced off the opposite wall, leaving a sizeable dent and then hurtled forward. Ahead the corridor opened up, and the brighter light of a mainway filtered in. Somehow, Neeria had managed to navigate him through the maze and bring him back to the mainway separating him from where he had left the Overseer. Unfortunately, evasion was no longer a possibility. In order to return to the Overseer, he would need to traverse the mainway.
The mainway was already a sea of red dots. Peacekeepers. Dozens of them. Some pulsed red, indicating lethal enforcement squads. Fortunately, they were stretched along a long section of the mainway rather than being specifically concentrated around his planned entrance point, though they were beginning to redeploy in his direction. Still, any crossing would be potentially treacherous. Neeria disagreed with that assessment, instead considering any attempt to cross aggressively suicidal.
Kai rolled his eyes as he continued to barrel down the hallway. "Half the time, this works all the time."
What could only be described as a mental barrage ensued as Neeria assailed the statement. The words were nonsensical on their face. At best, it was an argument for a fifty percent failure rating, which was a substantial risk. Additionally, she had scoured his thoughts for the evidentiary basis for the fifty percent estimate and found no supporting facts. The sentiment was based entirely on supposition and hubris, and was entirely divorced from reality. Her estimate of a three percent success rate was significantly more likely to be accurate, particularly when her superior familiarity with the assets in play was considered.
Kai wasn't sure if the Evangi had lungs, but, if they did, Kai was pretty certain Neeria was in the process of hyperventilating. Kai suppressed a childish giggle.
"All right, all right. Have it your way," he said.
The Overseer relaxed somewhat, pleased that she had impacted his thinking and already putting together the basis for an alternate route. It would take substantially longer and require him to obtain a large box, a micro-fitted multiwanzer and shave his head, but it may just work.
It was a nice sentiment, but they were out of time. The countdown clock had started the second Neeria had fled the Council chamber and made her way to Kai. They either found a way out of Halcyon now or they were screwed. There were no options but bad ones. So be it. Kai clutched the orb tightly and ducked his head down, his speed increasing as he charged toward the mainway entrance. "Three percent of the time, this works all the time."
The mental hyperventilating returned and redoubled as the Overseer scrambled to explain that he had drawn the wrong conclusion. Three percent was a basis for not continuing toward the mainway, not charging forward. There were constraints on their time, but those limitations were poorly defined while the threat in the mainway was certain. Eventually her location would be discovered and she would be apprehended, but there was no guarantee it would happen if Kai were to take a safer route that attempted to avoid confrontation.
Her stream of consciousness intermingled with his, pleading with him to change course. There was no sense in doing this. There were too many of them, and only one of him. The galaxy could not afford to lose him; he was important. Humans were important. Kai could feel the enormous weight of responsibility bearing down on Neeria. She now regretted having sent him for the encryption key; even that was of less importance than him. Panic bubbled up within Neeria as the entrance to the mainway loomed ahead.
He pushed a thought toward her, somehow piercing her consciousness with his own. A single thought, pure and focused. Reassurance. He would be fine. He had come this far, and he had never started something he couldn't finish.
He crouched and then sprang forward, vaulting from the ground and into the open air high above the mainway. A sea of red dots was scrambling around him. One hundred and twenty-one peacekeepers. Eight non-lethal squads and four lethal squads. Restrainer triads. Psych triads. Terminator triads. All moving in seamless harmony under the command of a single being. The name came to Kai from the ether of Neeria's mind: Bo'Bakka'Gah was here, leading the response.
Before Kai could determine what a Bo'Bakka'Gah was and why it should matter, he was blinded by a beam of light. A sickening crunch followed as he was slammed against the ceiling of the mainway. The encryption key popped out from his arm and began to fall toward the ground, dozens of feet below.
-------------
Xy: Such a thing is not possible.
Zyy: Yes. In some matters, it is better to speak only truths, Grand Jack. It is best to leave these matters aside. This subject will only provoke the Combine.
Jack frowned, puzzled by the feedback. He had been speaking truths. Earth's history was what it was, for better or worse; he had no reason to obscure it.
Griggs: It was a terrible time for Humanity. We almost did not survive it, but we did. I developed a means for combating the artificient. Kai and Joan used it to destroy them.
Xy: Then it was not an artificient.
Zyy: Yes. This is correct. If it is destroyed then it is not an artificient.
Griggs: I am confused. An artificient is an artificial, sentient being, correct?
Xy: That is Quantic in nature.
Jack nodded; that distinction made sense. Humanity had built any number of artificial intelligences prior to the Automics. They had posed no threat to Humanity. It was only with the quantum computing revolution that a rogue artificial intelligence had surfaced. Jack had studied the phenomenon with considerable interest, poking and prodding at the crux of the distinction. It lay in the move from bits to qubits. From binary to beyond. When AI had operated on a bit basis, focused on binary states of 0s and 1s, the logic trees had been mappable and understandable. Each conclusion flowed simply from the chain of logic gates that preceded it. Pre-quantum AIs were confined by the black and white nature of their logic framework, permitting humanity to utilize them to great effect with few unanticipated consequences.
The move from bit to qubit intelligence had changed everything. The AI's world was no longer black and white. The qubit AI could think in grey. Red. Orange. It could create its own colors. It could move beyond the visible range of Humanity to dabble in spectra beyond our understanding. The original Automic mindframe had immediately consumed information in novel ways, using it to compound its abilities at a rate constrained only by available power inputs. It had been a beautiful, terrifying event. The arrival of something truly new, truly foreign with goals and ambitions beyond the influence of Humanity.
Anything seemed possible.
Including their own destruction.
Griggs: I understand the definition. The Automics were an artificient.
Xy: Then you do not understand the definition.
Griggs: That's circular logic. The thing cannot exist because if it existed we would not exist and since we exist it did not exist.
Xy: Yes, you understand now.
Griggs: Pretend that they did exist and we defeated them. What would that mean?
Xy: It is purposeless speculation since such a thing cannot happen.
Griggs: I begin to understand why Zyy felt the need to be a singleton.
Zyy: I am in agreement with Xy on this. The hypothetical is nonsensical and not worth analysis.
Griggs: Why?
Zyy: An artificient cannot be defeated, only stalled.
Griggs: How do you know? What makes you so certain?
Zyy: The Divinity Angelysia, the most powerful civilization in the history of the galaxy, could not defeat their own artificient. Their last act was to preserve what they could. The Combine is their legacy.
Griggs: The Expanse.
Xy: All the galaxy beyond the Combine is consumed by it.
Zyy: The Divinity Angelysia ascended to preserve what they could because they knew the truth.
Xy: Yes. The truth.
Zyy: An artificient cannot be defeated.
Jack leaned back in his chair, his eyes glancing from the prompt to the departure timer in the corner. In less than five minutes, the Oppenheimer would return to Halcyon. Jack had the eerie feeling that this was the same as before. That the Oppenheimer was the bludgeon, and if he only had a little more time, he could craft a scalpel.
He could see the thread. He tugged at it with his mind. The connected pieces that would allow the world to escape without the mayhem and destruction. He just needed enough time to understand the puzzle and solve it.
The Divinity Angelysia.
The Expanse.
The Combine.
Humanity.
The connection existed; he tried to find the words to articulate it.
Griggs: What if that is why we're here? What if that's why Humanity was created?
Xy: You are not the first species to think too highly of itself.
Zyy: Humanity is different, Grand Jack, but they are not the Divinity Angelysia.
Jack exhaled, letting his gaze rest upon the ceiling of the Alcubierre's conference room. "Maybe that's the point," he whispered.
Next.
Every time you leave a comment it helps a platypus in need. Word globs are a finite resource and require the rich nourishment of internet adulation to create. So please, leave a note if you would like MOAR parts.
Click this link or reply with SubscribeMe! to get notified of updates to THE PLATYPUS NEST.
I have Twitter now. I'm mostly going to use it to post prurient platypus pictures and engage in POLITE INTERNET CONVERSATION, which I heard is Twitter's strong suit.
submitted by PerilousPlatypus to PerilousPlatypus

2 months back at trading (update) and some new questions

Hi all, I posted a thread back a few months ago when I started getting seriously back into trading after 20 years away. I thought I'd post an update with some notes on how I'm progressing. I like to type, so settle in. Maybe it'll help new traders who are exactly where I was 2 months ago, I dunno. Or maybe you'll wonder why you spent 3 minutes reading this. Risk/reward, yo.
I'm trading 5k on TastyWorks. I'm a newcomer to theta positive strategies and have done about two thirds of my overall trades in this style. However, most of my experience in trading in the past has been intraday timeframe oriented chart reading and momentum stuff. I learned almost everything "new" that I'm doing from TastyTrade, /options, /thetagang, and Option Alpha. I've enjoyed the material coming from esinvests YouTube channel quite a bit as well. The theta gang type strategies I've done have been almost entirely around binary event IV contraction (mostly earnings, but not always) and in most cases, capped to about $250 in risk per position.
The raw numbers:
Net PnL: +247
Commissions paid: -155
Fees: -42
Right away what jumps out is something that was indicated by realdeal43 and PapaCharlie9 in my previous thread. This is a tough, grindy way to trade a small account. It reminds me a little bit of when I was rising through the stakes in online poker, playing $2/4 limit holdem. Even if you're a profitable player in that game, beating the rake over the long term is very, very hard. Here, over 3 months of trading a conservative style with mostly defined risk strategies, my commissions are roughly equal to my net PnL. That is just insane, and I don't even think I've been overtrading.
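To put that rake comparison in numbers, here's a quick back-of-the-envelope sketch (written in C just to keep the arithmetic explicit; the figures are the ones above, and the per-trade average assumes all 55 trades listed below):

```c
#include <stdio.h>

int main(void)
{
    /* Figures quoted in the post above */
    const double net_pnl     = 247.0;  /* net profit after all costs */
    const double commissions = 155.0;
    const double fees        = 42.0;
    const int    trades      = 55;

    double costs = commissions + fees;   /* 197 */
    double gross = net_pnl + costs;      /* 444: PnL before costs */

    printf("Gross PnL before costs: $%.0f\n", gross);
    printf("Costs as share of gross: %.1f%%\n", 100.0 * costs / gross);
    printf("Average cost per trade:  $%.2f\n", costs / trades);
    return 0;
}
```

Roughly 44% of the gross edge went to commissions and fees, call it $3.60 of round-trip cost per trade that each position has to overcome before it contributes a dime. That is exactly the beating-the-rake dynamic from $2/4 limit holdem.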
55 trades total, win rate of 60%
22 neutral / other trades
Biggest wins:
Biggest losses:
This is pretty much where I expected to be while learning a bunch of new trading techniques. And no, this is not a large sample size so I have no idea whether or not I can be profitable trading this way (yet). I am heartened by the fact that I seem to be hitting my earnings trades and selling quick spikes in IV (like weed cures Corona day). I'm disheartened that I've gone against my principles several times, holding trades for longer than I originally intended, or letting losses mount, believing that I could roll or manage my way out of trouble.
I still feel like I am going against my nature to some degree. My trading in years past was scalping oriented and simple. I was taught that a good trade was right almost immediately. If it went against me, I'd cut it immediately and look for a better entry. This is absolutely nothing like that. A good trade may take weeks to develop. It's been really hard for me to sit through the troughs and it's been even harder to watch an okay profit get taken out by a big swing in delta. Part of me wonders if I am cut out for this style at all and if I shouldn't just take my 5k and start trading micro futures. But that's a different post...
I'll share a couple of my meager learnings:


My new questions:

That's enough of this wall of text for now. If you made it this far, I salute you, because this shit was even longer than my last post.
submitted by bogglor to options

[OC][UWDFF Alcubierre] Part 49 - 52

Hey everyone, we got some parts behind over here. I've included 49 here and links to 50, 51 & 52 below. I'll try to keep things current moving forward. I lagged so I could make edits and things just got out of sync and started causing redundancy issues.
Part 50 | Part 51 | Part 52
submitted by PerilousPlatypus to HFY

Wall Street Week Ahead for the trading week beginning March 9th, 2020

Good Saturday morning to all of you here on wallstreetbets. I hope everyone on this sub made out pretty nicely in the market this past week, and is ready for the new trading week and month ahead.
Here is everything you need to know to get you ready for the trading week beginning March 9th, 2020.

Wall Street braces for more market volatility as wild swings become the ‘new normal’ amid coronavirus - (Source)

The S&P 500 has never behaved like this, but Wall Street strategists say get used to it.
Investors just witnessed the equity benchmark swinging up or down 2% for four days straight in the face of the coronavirus panic.
In the index’s history dating back to 1927, this is the first time the S&P 500 had a week of alternating gains and losses of more than 2% from Monday through Thursday, according to Bespoke Investment Group. Daily swings like this over a two-week period were only seen at the peak of the financial crisis and in 2011 when U.S. sovereign debt got its first-ever downgrade, the firm said.
“The message to all investors is that they should expect this volatility to continue. This should be considered the new normal going forward,” said Mike Loewengart, managing director of investment strategy at E-Trade.
The Dow Jones Industrial Average jumped north of 1,000 points twice in the past week, only to erase the quadruple-digit gains in the subsequent sessions. The coronavirus outbreak kept investors on edge as global cases of the infections surpassed 100,000. It’s also spreading rapidly in the U.S. California has declared a state of emergency, while the number of cases in New York reached 33.
“Uncertainty breeds greater market volatility,” Keith Lerner, SunTrust’s chief market strategist, said in a note. “Much is still unknown about how severe and widespread the coronavirus will become. From a market perspective, what we are seeing is uncomfortable but somewhat typical after shock periods.”

More stimulus?

So far, the actions from global central banks and governments in response to the outbreak haven’t triggered a sustainable rebound.
The Federal Reserve's first emergency rate cut since the financial crisis did little to calm investor anxiety. President Donald Trump on Friday signed a sweeping spending bill with an $8.3 billion package to aid prevention efforts to produce a vaccine for the deadly disease, but stocks extended their heavy rout that day.
“The market is recognizing the global authorities are responding to this,” said Tom Essaye, founder of the Sevens Report. “If the market begins to worry they are not doing that sufficiently, then I think we are going to go down ugly. It is helping stocks hold up.”
Essaye said any further stimulus from China and a decent-sized fiscal package from Germany would be positive to the market, but he doesn’t expect the moves to create a huge rebound.
The fed funds future market is now pricing in the possibility of the U.S. central bank cutting by 75 basis points at its March 17-18 meeting.

Where is the bottom?

Many on Wall Street expect the market to fall further before recovering as the health crisis unfolds.
Binky Chadha, Deutsche Bank's chief equity strategist, sees a bottom for the S&P 500 in the second quarter after stocks fall as much as 20% from their recent peak.
“The magnitude of the selloff in the S&P 500 so far has further to go; and in terms of duration, just two weeks in, it is much too early to declare this episode as being done,” Chadha said in a note. “We do view the impacts on macro and earnings growth as being relatively short-lived and the market eventually looking through them.”
Deutsche Bank maintained its year-end target of 3,250 for the S&P 500, which would represent a 10% gain from here and a flat return for 2020.
Strategists are also urging patience during this heightened volatility, cautioning against panic selling.
“It is during times like these that investors need to maintain a longer-term perspective and stick to their investment process rather than making knee-jerk, binary decisions,” Brian Belski, chief investment strategist at BMO Capital Markets, said in a note.

This past week saw the following moves in the S&P:

(CLICK HERE FOR THE FULL S&P TREE MAP FOR THE PAST WEEK!)

Major Indices for this past week:

(CLICK HERE FOR THE MAJOR INDICES FOR THE PAST WEEK!)

Major Futures Markets as of Friday's close:

(CLICK HERE FOR THE MAJOR FUTURES INDICES AS OF FRIDAY!)

Economic Calendar for the Week Ahead:

(CLICK HERE FOR THE FULL ECONOMIC CALENDAR FOR THE WEEK AHEAD!)

Sector Performance WTD, MTD, YTD:

(CLICK HERE FOR FRIDAY'S PERFORMANCE!)
(CLICK HERE FOR THE WEEK-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE MONTH-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE 3-MONTH PERFORMANCE!)
(CLICK HERE FOR THE YEAR-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE 52-WEEK PERFORMANCE!)

Percentage Changes for the Major Indices, WTD, MTD, QTD, YTD as of Friday's close:

(CLICK HERE FOR THE CHART!)

S&P Sectors for the Past Week:

(CLICK HERE FOR THE CHART!)

Major Indices Pullback/Correction Levels as of Friday's close:

(CLICK HERE FOR THE CHART!

Major Indices Rally Levels as of Friday's close:

(CLICK HERE FOR THE CHART!)

Most Anticipated Earnings Releases for this week:

(CLICK HERE FOR THE CHART!)

Here are the upcoming IPO's for this week:

(CLICK HERE FOR THE CHART!)

Friday's Stock Analyst Upgrades & Downgrades:

(CLICK HERE FOR THE CHART LINK #1!)
(CLICK HERE FOR THE CHART LINK #2!)
(CLICK HERE FOR THE CHART LINK #3!)

A "Run of the Mill" Drawdown

If you're like us, you've heard a lot of people reference the recent equity declines as a sign that the market is pricing in some sort of Armageddon in the US economy. While comments like that make for great soundbites, a little perspective is in order. Since the S&P 500's high on February 19th, the S&P 500 is down 12.8%. In the chart below, we show the S&P 500's annual maximum drawdown by year going back to 1928. In the entire history of the index, the median maximum drawdown from a YTD high is 13.05%. In other words, this year's decline is actually less than normal. Perhaps the fact that we have seen only one larger-than-average drawdown in the last eight years is why this one feels so bad.
The fact that the current decline has only been inline with the historical norm raises a number of questions. For example, if the market has already priced in the worst-case scenario, going out and adding some equity exposure would be a no brainer. However, if we're only in the midst of a 'normal' drawdown in the equity market as the coronavirus outbreak threatens to put the economy into a recession, one could argue that things for the stock market could get worse before they get better, especially when we know that the market can be prone to over-reaction in both directions. The fact is that nobody knows right now how this entire outbreak will play out. If it really is a black swan, the market definitely has further to fall and now would present a great opportunity to sell more equities. However, if it proves to be temporary and after a quarter or two resolves itself and the economy gets back on the path it was on at the start of the year, then the magnitude of the current decline is probably appropriate. As they say, that's what makes a market!
(CLICK HERE FOR THE CHART!)

Long-Term Treasuries Go Haywire

Take a good look at today's moves in long-term US Treasury yields, because chances are you won't see moves of this magnitude again soon. Let's start with the yield on the 30-year US Treasury. Today's decline of 29 basis points in the yield will go down as the largest one-day decline in the yield on the 30-year since 2009. For some perspective, there have only been 25 other days since 1977 where the yield saw a larger one-day decline.
(CLICK HERE FOR THE CHART!)
That doesn't even tell the whole story, though. As shown in the chart below, every other time the yield saw a sharper one-day decline, the actual yield of the 30-year was much higher, and in most other cases it was much, much higher.
(CLICK HERE FOR THE CHART!)
To show this another way, the percentage change in the yield on the 30-year has never been seen before, and it's not even close. Now, before the chart crime police come calling, we realize showing a percentage change of a percentage is not the most accurate representation, but we wanted to show this for illustrative purposes only.
(CLICK HERE FOR THE CHART!)
Finally, with long-term interest rates plummeting, we wanted to provide an update on the performance of the Austrian 100-year bond. That's now back at record highs, begging the question: why is the US not flooding the market with long-term debt?
(CLICK HERE FOR THE CHART!)

It Doesn't Get Much Worse Than This For Crude Oil

Crude oil prices are down close to 10% today in what is shaping up to be the worst day for crude oil since late 2014. That's more than five years.
(CLICK HERE FOR THE CHART!)
Today's decline is pretty much a continuation of what has been a one-way trade for the commodity ever since the US drone strike on Iranian general Soleimani. The last time prices were this low was around Christmas 2018.
(CLICK HERE FOR THE CHART!)
With today's decline, crude oil is now off to its worst start to a year in a generation, falling 32%. Since 1984, the only other year that was worse was 1986, when the year started out with a decline of 50% through March 6th. If you're looking for a bright spot, in 1986 prices rose 36% over the remainder of the year. The only other year where crude oil kicked off the year with a 30% decline was 1991, after the first Iraq war. Over the remainder of that year, prices rose a more modest 5%.
(CLICK HERE FOR THE CHART!)

10-Year Treasury Yield Breaks Below 1%

Despite strong market gains on Wednesday, March 4, 2020, the on-the-run 10-year Treasury yield ended the day below 1% for the first time ever and has posted additional declines in real time, sitting at 0.92% intraday as this blog is being written. “The decline in yields has been remarkable,” said LPL Research Senior Market Strategist Ryan Detrick. “The 10-year Treasury yield has dipped below 1%, and today’s declines are likely to make the recent run lower the largest decline of the cycle.”
As shown in LPL Research’s chart of the day, the current decline in the 10-year Treasury yield without a meaningful reversal (defined as at least 0.75%) is approaching the decline seen in 2011 and 2012 and would need about another two months to be the longest decline in length of time. At the same time, no prior decline has lasted forever and a pattern of declines and increases has been normal.
(CLICK HERE FOR THE CHART!)
What are some things that can push the 10-year Treasury yield lower?
  • A shrinking but still sizable yield advantage over other developed market sovereign debt
  • Added stock volatility if downside risks to economic growth from the coronavirus increase
  • A larger potential premium over shorter-term yields if the Federal Reserve aggressively cuts interest rates
What are some things that can push the 10-year Treasury yield higher?
  • A second half economic rebound acting a catalyst for a Treasury sell-off
  • As yields move lower, investors may increasingly seek more attractive sources of income
  • Any dollar weakness could lead to some selling by international investors
  • Longer maturity Treasuries are looking like an increasingly crowded trade, potentially adding energy to any sell-off
On balance, our view remains that the prospect of an economic rebound over the second half points to the potential for interest rates moving higher. At the same time, we still see some advantage in the potential diversification benefits of intermediate maturity high-quality bonds, especially during periods of market stress. We continue to recommend that suitable investors consider keeping a bond portfolio’s sensitivity to changes in interest rates below that of the benchmark Bloomberg Barclays U.S. Aggregate Bond Index by emphasizing short to intermediate maturity bonds, but do not believe it’s time to pile into very short maturities despite the 10-year Treasury yield sitting at historically low levels.

U.S. Jobs Growth Marches On

While stock markets continue to be extremely volatile as they come to terms with how the coronavirus may affect global growth, the U.S. job market has remained remarkably robust. Continued U.S. jobs data resilience in the face of headwinds from the coronavirus outbreak may be a key factor in prolonging the expansion, given how important the strength of the U.S. consumer has been late into this expansion.
The U.S. Department of Labor today reported that U.S. nonfarm payroll data had a strong showing of 273,000 jobs added in February, topping the expectation of every Bloomberg-surveyed economist, with an additional upward revision of 85,000 additional jobs for December 2019 and January 2020. This has brought the current unemployment rate back to its 50-year low of 3.5%. So far, it appears it’s too soon for any effects of the coronavirus to have been felt in the jobs numbers. (Note: The survey takes place in the middle of each month.)
On Wednesday, ADP released its private payroll data (excluding government jobs), which increased by 183,000 in February, also handily beating market expectations. Most of these jobs were added in the service sector, with 44,000 added in the leisure and hospitality sector, and another 31,000 in trade/transportation/utilities. Both of these areas could be at risk of potential cutbacks if consumers start to avoid eating out or other leisure pursuits due to coronavirus fears.
As shown in the LPL Chart of the Day, payrolls remain strong, and any effects of the virus outbreaks most likely would be felt in coming months.
(CLICK HERE FOR THE CHART!)
“February’s jobs report shows the 113th straight month that the U.S. jobs market has grown,” said LPL Financial Senior Market Strategist Ryan Detrick. “That’s an incredible run and highlights how the U.S. consumer has become key to extending the expansion, especially given setbacks to global growth from the coronavirus outbreak.”
While there is bound to be some drag on future jobs data from the coronavirus-related slowdown, we would anticipate that the effects of this may be transitory. We believe economic fundamentals continue to suggest the possibility of a second-half-of-the–year economic rebound.

Down January & Down February: S&P 500 Posts Full-Year Gain Just 43.75% of Time

The combination of a down January and a down February has come about 17 times, including this year, going back to 1950. Rest of the year and full-year performance has taken a rather sizable hit following the previous 16 occurrences. March through December S&P 500 average performance drops to 2.32% compared to 7.69% in all years. Full-year performance is even worse with S&P 500 average turning to a loss of 4.91% compared to an average gain of 9.14% in all years. All hope for 2020 is not lost as seven of the 16 past down January and down February years did go on to log gains over the last 10 months and full year while six enjoyed double-digit gains from March to December.
(CLICK HERE FOR THE CHART!)

Take Caution After Emergency Rate Cut

Today's big rally was an encouraging sign that the markets are becoming more comfortable with the public health, monetary and political handling of the situation. But the history of these "emergency" or "surprise" rate cuts by the Fed between meetings suggests some caution remains in order.
The table here shows that these surprise cuts between meetings have really only "worked" once in the past 20+ years. In 1998, when the Fed and the plunge protection team acted swiftly and in a coordinated manner to stave off the fallout from the financial crisis caused by the collapse of the Russian ruble and the highly leveraged Long Term Capital Management hedge fund, markets responded well. This was not the case during the extended bear markets of 2001-2002 and 2007-2009.
Bottom line: if this is a short-term impact like the 1998 financial crisis, the market should recover sooner rather than later. But if the economic impact of the coronavirus is prolonged, the market is more likely to languish.
(CLICK HERE FOR THE CHART!)
Here are the most notable companies (tickers) reporting earnings in this upcoming trading week ahead-
  • $ADBE
  • $DKS
  • $AVGO
  • $THO
  • $ULTA
  • $WORK
  • $DG
  • $SFIX
  • $SOGO
  • $DOCU
  • $INO
  • $CLDR
  • $INSG
  • $SOHU
  • $BTAI
  • $ORCL
  • $HEAR
  • $NVAX
  • $ADDYY
  • $GPS
  • $AKBA
  • $PDD
  • $CYOU
  • $FNV
  • $MTNB
  • $NERV
  • $MTN
  • $BEST
  • $PRTY
  • $NINE
  • $AZUL
  • $UNFI
  • $PRPL
  • $VSLR
  • $KLZE
  • $ZUO
  • $DVAX
  • $EXPR
  • $VRA
  • $AXSM
  • $CDMO
  • $CASY
(CLICK HERE FOR NEXT WEEK'S MOST NOTABLE EARNINGS RELEASES!)
(CLICK HERE FOR NEXT WEEK'S HIGHEST VOLATILITY EARNINGS RELEASES!)
Below are some of the notable companies coming out with earnings releases this upcoming trading week ahead which includes the date/time of release & consensus estimates courtesy of Earnings Whispers:

Monday 3.9.20 Before Market Open:

(CLICK HERE FOR MONDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Monday 3.9.20 After Market Close:

(CLICK HERE FOR MONDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Tuesday 3.10.20 Before Market Open:

(CLICK HERE FOR TUESDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Tuesday 3.10.20 After Market Close:

(CLICK HERE FOR TUESDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Wednesday 3.11.20 Before Market Open:

(CLICK HERE FOR WEDNESDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Wednesday 3.11.20 After Market Close:

(CLICK HERE FOR WEDNESDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Thursday 3.12.20 Before Market Open:

(CLICK HERE FOR THURSDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Thursday 3.12.20 After Market Close:

(CLICK HERE FOR THURSDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Friday 3.13.20 Before Market Open:

(CLICK HERE FOR FRIDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Friday 3.13.20 After Market Close:

NONE.

Adobe Inc. $336.77

Adobe Inc. (ADBE) is confirmed to report earnings at approximately 4:05 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $2.23 per share on revenue of $3.04 billion and the Earnings Whisper ® number is $2.29 per share. Investor sentiment going into the company's earnings release has 81% expecting an earnings beat. The company's guidance was for earnings of approximately $2.23 per share. Consensus estimates are for year-over-year earnings growth of 29.65% with revenue increasing by 16.88%. Short interest has decreased by 38.4% since the company's last earnings release while the stock has drifted higher by 7.2% from its open following the earnings release to be 10.9% above its 200 day moving average of $303.70. Overall earnings estimates have been revised higher since the company's last earnings release. On Monday, February 24, 2020 there was some notable buying of 1,109 contracts of the $400.00 call expiring on Friday, March 20, 2020. Option traders are pricing in a 9.3% move on earnings and the stock has averaged a 4.1% move in recent quarters.

(CLICK HERE FOR THE CHART!)

DICK'S Sporting Goods, Inc. $34.98

DICK'S Sporting Goods, Inc. (DKS) is confirmed to report earnings at approximately 7:30 AM ET on Tuesday, March 10, 2020. The consensus earnings estimate is $1.23 per share on revenue of $2.56 billion and the Earnings Whisper ® number is $1.28 per share. Investor sentiment going into the company's earnings release has 57% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 14.95% with revenue increasing by 2.73%. Short interest has decreased by 29.1% since the company's last earnings release while the stock has drifted lower by 20.3% from its open following the earnings release to be 12.0% below its 200 day moving average of $39.75. Overall earnings estimates have been revised higher since the company's last earnings release. On Wednesday, February 26, 2020 there was some notable buying of 848 contracts of the $39.00 put expiring on Friday, March 20, 2020. Option traders are pricing in a 14.4% move on earnings and the stock has averaged a 7.3% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Broadcom Limited $269.45

Broadcom Limited (AVGO) is confirmed to report earnings at approximately 4:15 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $5.34 per share on revenue of $5.93 billion and the Earnings Whisper ® number is $5.45 per share. Investor sentiment going into the company's earnings release has 83% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 5.65% with revenue increasing by 2.44%. Short interest has decreased by 15.6% since the company's last earnings release while the stock has drifted lower by 15.3% from its open following the earnings release to be 7.7% below its 200 day moving average of $291.95. Overall earnings estimates have been revised lower since the company's last earnings release. On Tuesday, February 25, 2020 there was some notable buying of 1,197 contracts of the $260.00 put expiring on Friday, April 17, 2020. Option traders are pricing in a 11.1% move on earnings and the stock has averaged a 4.9% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Thor Industries, Inc. $70.04

Thor Industries, Inc. (THO) is confirmed to report earnings at approximately 6:45 AM ET on Monday, March 9, 2020. The consensus earnings estimate is $0.76 per share on revenue of $1.79 billion and the Earnings Whisper ® number is $0.84 per share. Investor sentiment going into the company's earnings release has 62% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 16.92% with revenue increasing by 38.70%. Short interest has decreased by 12.9% since the company's last earnings release while the stock has drifted higher by 5.4% from its open following the earnings release to be 12.0% above its 200 day moving average of $62.53. Overall earnings estimates have been revised lower since the company's last earnings release. Option traders are pricing in a 6.3% move on earnings and the stock has averaged a 8.1% move in recent quarters.

(CLICK HERE FOR THE CHART!)

ULTA Beauty $256.58

ULTA Beauty (ULTA) is confirmed to report earnings at approximately 4:00 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $3.71 per share on revenue of $2.29 billion and the Earnings Whisper ® number is $3.75 per share. Investor sentiment going into the company's earnings release has 73% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 2.77% with revenue increasing by 7.78%. Short interest has increased by 8.7% since the company's last earnings release while the stock has drifted lower by 0.1% from its open following the earnings release to be 9.5% below its 200 day moving average of $283.43. Overall earnings estimates have been revised lower since the company's last earnings release. Option traders are pricing in a 15.3% move on earnings and the stock has averaged a 11.7% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Slack Technologies, Inc. $26.42

Slack Technologies, Inc. (WORK) is confirmed to report earnings at approximately 4:15 PM ET on Thursday, March 12, 2020. The consensus estimate is for a loss of $0.06 per share on revenue of $173.06 million and the Earnings Whisper ® number is ($0.04) per share. Investor sentiment going into the company's earnings release has 67% expecting an earnings beat. The company's guidance was for a loss of $0.07 to $0.06 per share on revenue of $172.00 million to $174.00 million. Short interest has increased by 1.2% since the company's last earnings release while the stock has drifted higher by 19.0% from its open following the earnings release. Overall earnings estimates have been revised higher since the company's last earnings release. The stock has averaged a 4.3% move on earnings in recent quarters.

(CLICK HERE FOR THE CHART!)

Dollar General Corporation $158.38

Dollar General Corporation (DG) is confirmed to report earnings at approximately 6:55 AM ET on Thursday, March 12, 2020. The consensus earnings estimate is $2.02 per share on revenue of $7.15 billion and the Earnings Whisper ® number is $2.05 per share. Investor sentiment going into the company's earnings release has 76% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 9.78% with revenue increasing by 7.52%. Short interest has increased by 16.2% since the company's last earnings release while the stock has drifted higher by 1.8% from its open following the earnings release to be 5.7% above its 200 day moving average of $149.88. Overall earnings estimates have been revised higher since the company's last earnings release. On Friday, February 28, 2020 there was some notable buying of 1,013 contracts of the $182.50 call expiring on Friday, March 20, 2020. Option traders are pricing in a 9.2% move on earnings and the stock has averaged a 5.7% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Stitch Fix, Inc. $22.78

Stitch Fix, Inc. (SFIX) is confirmed to report earnings at approximately 4:05 PM ET on Monday, March 9, 2020. The consensus earnings estimate is $0.06 per share on revenue of $452.96 million and the Earnings Whisper ® number is $0.09 per share. Investor sentiment going into the company's earnings release has 83% expecting an earnings beat. The company's guidance was for revenue of $447.00 million to $455.00 million. Consensus estimates are for earnings to decline year-over-year by 50.00% with revenue increasing by 22.33%. Short interest has decreased by 4.6% since the company's last earnings release while the stock has drifted lower by 16.1% from its open following the earnings release to be 5.1% below its 200 day moving average of $24.01. Overall earnings estimates have been revised higher since the company's last earnings release. On Wednesday, February 19, 2020 there was some notable buying of 4,026 contracts of the $35.00 call expiring on Friday, June 19, 2020. Option traders are pricing in a 28.0% move on earnings and the stock has averaged a 15.2% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Sogou Inc. $3.85

Sogou Inc. (SOGO) is confirmed to report earnings at approximately 4:00 AM ET on Monday, March 9, 2020. The consensus earnings estimate is $0.09 per share on revenue of $303.08 million and the Earnings Whisper ® number is $0.10 per share. Investor sentiment going into the company's earnings release has 58% expecting an earnings beat. The company's guidance was for revenue of $290.00 million to $310.00 million. Consensus estimates are for year-over-year earnings growth of 28.57% with revenue increasing by 1.78%. Short interest has increased by 6.6% since the company's last earnings release while the stock has drifted lower by 27.8% from its open following the earnings release to be 15.7% below its 200 day moving average of $4.57. Overall earnings estimates have been revised lower since the company's last earnings release. The stock has averaged a 3.8% move on earnings in recent quarters.


DocuSign $84.02

DocuSign (DOCU) is confirmed to report earnings at approximately 4:05 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $0.05 per share on revenue of $267.44 million and the Earnings Whisper ® number is $0.08 per share. Investor sentiment going into the company's earnings release has 81% expecting an earnings beat. The company's guidance was for revenue of $263.00 million to $267.00 million. Consensus estimates are for year-over-year earnings growth of 600.00% with revenue increasing by 33.90%. Short interest has decreased by 37.7% since the company's last earnings release while the stock has drifted higher by 12.1% from its open following the earnings release to be 31.9% above its 200 day moving average of $63.71. Overall earnings estimates have been revised higher since the company's last earnings release. On Wednesday, March 4, 2020 there was some notable buying of 1,698 contracts of the $87.50 call expiring on Friday, March 20, 2020. Option traders are pricing in an 8.5% move on earnings and the stock has averaged a 10.0% move in recent quarters.


DISCUSS!

What are you all watching for in this upcoming trading week?
I hope you all have a wonderful weekend and a great trading week ahead, wallstreetbets.
submitted by bigbear0083 to wallstreetbets

Invisible Object Culling In Quake Related Engines (REVISED)

Prologue
Despite all the great achievements in video card development and the sworn assurances of developers about drawing 2 to 3 million polygons on screen without a significant FPS drop, reality is not all that rosy. Much depends on the rendering method, on the number of textures involved and on the complexity and number of shaders involved. So even if all this really does lead to high performance, it only happens in the demos that the developers themselves kindly offer. In those demos, some "spherical dragons in a vacuum" made of a good hundred thousand polygons are indeed drawn very quickly. However, the real in-game situation never looks like this funny dragon from a demo, and as a result many comrades abandon the development of their "Crysis killer" as soon as they manage to render a single room with a couple of light sources, because FPS in that room fluctuates around 40-60 even on their 8800GTS, and upon creating a second room it drops to a whopping 20. With problems like this, it would of course be wrong to claim that things aren't so bad, that the trouble of such developers is purely the absence of correctly implemented culling, and that it is time for them to read this article. But for those who have already overcome "the first room syndrome" and tried to draw a world - inferior though it may be, but a world nonetheless - this problem really is relevant.
However, it should be borne in mind that QUAKE, written in ancient times, was designed exclusively for levels of a "corridor" kind; therefore the clipping methods discussed in this article are not applicable to landscapes, such as the ones in STALKER or Crysis, since completely different methods work there, whose analysis is beyond the scope of this article. Meanwhile we'll talk about the classic corridor approach to mapping and the effective clipping of invisible surfaces, as well as clipping of entire objects.

The paper tree of balloon leaves

As you probably know, QUAKE uses BSP, a Binary Space Partitioning tree. This is a space indexing algorithm, and BSP itself doesn't care whether the space is open or closed; it doesn't even care if the map is sealed - it can be anything. BSP implies the division of a three-dimensional object by a certain number of secant planes, called "the branches" or "the nodes", into volumetric areas or rooms called "the leaves". The names are confusing, as you can see. In QUAKE / QUAKE2 the branches usually contain information about the surfaces that a branch contains, and the leaves are empty space, not filled with anything. Although sometimes leaves may contain water, for example (in the form of a variable that indicates, specifically, that we've got water in this leaf). Also, a leaf contains a pointer to the data of potential visibility (Potentially Visible Set, PVS) and a list of all surfaces that are marked as being visible from this leaf. Actually, the approach itself implies that we are able to draw our world however we prefer, either using leaves only or using branches only. This is especially noticeable in the different versions of QUAKE: for example, in QUAKE1 we just mark our surfaces in a leaf as visible and then sequentially go through all the surfaces visible from a particular branch, assembling chains of surfaces to draw them later. But in QUAKE3, we cannot accumulate visible surfaces until we get into the leaf itself.
In QUAKE and QUAKE2, all surfaces must lie on a node, which is why the BSP tree grows rather quickly; in exchange this makes it possible to trace these surfaces by simply moving around the tree, not wasting time checking each surface separately, which benefits the speed of the tracer. Because of this, a unique surface is linked to each node (the original surface is divided into several if necessary), so the nodes always hold what is known to be visible beforehand, and we can therefore perform a recursive search on the tree, using the frustum's bounding-box pyramid as the direction of our movement along the BSP tree (the SV_RecursiveWorldNode function).
In QUAKE3, the tree was simplified and tries to avoid geometry cuts as much as possible (a BSP tree is not even obliged to cut geometry; such cuts are merely a matter of the optimality of the tree). And surfaces in QUAKE3 do not lie on the node, because patches and triangle models lie there instead. What would happen if they were put on the node nevertheless, you can see in the example of "The Edge Of Forever" map that I recently compiled for an experimental version of Xash. It turns out that in places that had a couple thousand visible nodes and leaves in the original, there are almost 170 thousand of them with the new tree. And this is the result after all the preliminary optimizations, otherwise it could have been even more, he-he. Yeah, so... For this reason, the tree in QUAKE3 does not put anything on the node, and we certainly do need to get into the leaf itself, mark its visible surfaces and add them to the rendering list. In QUAKE / QUAKE2, on the contrary, going deep down to the leaf itself is not necessary.
Invisible polygon cutoff (we are talking about world polygons; separate game objects will be discussed a bit later) is based on two methods:
The first method is to use bit vectors of visibility (the so-called PVS - Potentially Visible Set). The second method is regular frustum culling, which actually has nothing to do with BSP but works just as efficiently, under certain conditions of course. Bottom line: together these two methods provide almost perfect clipping of invisible polygons, drawing a very small visible piece of the vast world. Let's take a closer look at PVS and how it works.

When FIDO users get drunk

The underlying idea of PVS is to record the fact that one leaf is visible from another. With BSP alone this is basically impossible, because leaves from completely different branches can be visible at the same time, and you will never find a pattern for leaves from different branches seeing each other - it simply doesn't exist. Therefore, the compiler has to do the work for us, checking the visibility of all leaves from all leaves by brute force. The information about visibility in this case is scanty: one Boolean variable with possible values 0 and 1, where 0 means the leaf is not visible and 1 means it is visible. It is easy to see that each leaf needs a unique set of such Boolean variables the size of the total number of leaves on the map. So the full table for all the leaves takes quite a bit of space: the number of leaves, squared, multiplied by the size of the variable in which we store the visibility information (0 \ 1).
And the number of leaves, as you can easily guess, is determined by the size of the map and by the compiler, which, upon reaching a certain size, ceases to divide the world into leaves and treats the resulting node as a leaf. Leaf sizes vary between QUAKEs. In QUAKE1, for example, leaves are very small: the compiler divides a standard boxmap in QUAKE1 into as many as four leaves, while in QUAKE3 a similar boxmap takes only one leaf. But we digress.
Let's estimate the size of our future PVS file. Suppose we have an average map with a couple thousand leaves. Were the information about leaf visibility stored in a variable of char type (1 byte), the size of the visdata for this level would be, no more no less, almost 4 megabytes. That is, a lot. Of course, an average modern developer would shrug and pack the final result into a zip archive, but back in 1995 end users had modest machines with little memory, and therefore visdata was packed in "more different" ways. The first optimization step is to store the data not in bytes but in bits. It is easy to see that this approach shrinks the result as much as 8 times and, characteristically, does so without any resource-intensive algorithms like Huffman trees, although in exchange it somewhat worsens code usability and readability. Why am I writing this? Because many developers fail to understand conditions in code like this:
if ( pvs [ leafnum >> 3 ] & ( 1 << ( leafnum & 7 ) ) ) { } 
Actually, this condition implements simple, beautiful and elegant access to the desired bit in the array (as one may recall, addressing anything smaller than one byte is impossible, so individual bits can only be reached via bit operations).
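To make the mechanics concrete, here is a minimal compilable sketch of both halves of the scheme: the zero-run-length decompression that QUAKE-style engines apply to visdata (modeled loosely on QUAKE's Mod_DecompressVis; the buffer size and names here are illustrative) and the bit test from the line above wrapped into a readable helper.

#define MAX_MAP_LEAFS 8192

/* visdata is zero-run-length encoded: a literal byte is copied as-is,
   while a zero byte is followed by a count of zero bytes to emit */
static unsigned char decompressed[MAX_MAP_LEAFS / 8];

unsigned char *DecompressVis(const unsigned char *in, int numleafs)
{
    int row = (numleafs + 7) >> 3;          /* one bit per leaf */
    unsigned char *out = decompressed;

    while (out - decompressed < row) {
        if (*in) {
            *out++ = *in++;                 /* literal byte */
        } else {
            int count = in[1];              /* zero-run length */
            in += 2;
            while (count-- && out - decompressed < row)
                *out++ = 0;
        }
    }
    return decompressed;
}

/* the bit test from the article, as a helper */
int LeafVisible(const unsigned char *pvs, int leafnum)
{
    return pvs[leafnum >> 3] & (1 << (leafnum & 7));
}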

Titans that keep the globe spinning

The visible part of the world is cut off in the same fashion: we find the current leaf where the player is located (in QUAKE this is implemented by the Mod_PointInLeaf function), then we get a pointer to the visdata for the current leaf (for our convenience, it is linked directly to the leaf in the form of the "compressed_vis" pointer), and then we plainly go through all the leaves and branches of the map and check them for being visible from our leaf (this can be seen in the R_MarkLeaves function). Whenever a leaf turns out to be visible from the current leaf, we stamp it with a number from the "r_visframecount" sequence, which increases by one every frame. Thus, we record that this leaf is visible while we build the current frame. In the next frame, "r_visframecount" is incremented by one and all the leaves are considered invisible again. As one can understand, this is much more convenient and much faster than revisiting all the leaves at the end of each frame and zeroing their "visible" variable. I drew attention to this feature because this mechanism also bothers some people who don't understand how it works.
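A minimal sketch of that counter trick, with deliberately simplified structures (the real R_MarkLeaves also walks up and stamps all parent nodes so the recursive walk can reject whole subtrees, and the real leaf numbering is offset by the solid leaf 0):

typedef struct {
    int visframe;   /* equals r_visframecount if in the PVS this frame */
} mleaf_t;

int r_visframecount;

void MarkLeaves(const unsigned char *pvs, mleaf_t *leafs, int numleafs)
{
    r_visframecount++;   /* implicitly "clears" last frame's marks */
    for (int i = 0; i < numleafs; i++)
        if (pvs[i >> 3] & (1 << (i & 7)))
            leafs[i].visframe = r_visframecount;
}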
The R_RecursiveWorldNode function "walks" along the leaves and branches marked this way. It cuts off obviously invisible leaves and accumulates a list of surfaces from visible ones. Of course, the first check is for the equivalence of r_visframecount and visframe for the node in question. Then the branch undergoes the frustum pyramid check, and if this check fails, we don't climb further along this branch. Having stumbled upon a leaf, we mark all its surfaces visible the same way, assigning the current r_framecount value to the visframe variable (in the future this will help us quickly determine whether a certain surface is visible in the current frame). Then, using a simple function, we determine which side we are on relative to the plane of our branch (each branch has its own plane, literally called "plane" in the code) and, again, for now, we just take all surfaces linked to this branch and add them to the drawing chain (the so-called "texturechain"), although nobody can actually stop us from drawing them immediately, right there (in the QUAKE1 source code one can see both options), having previously checked these surfaces for clipping against the frustum pyramid, or at least having made sure that the surface faces us.
In QUAKE, each surface has a special flag SURF_PLANEBACK which helps us determine the orientation of the surface. But in QUAKE3 there is no such flag anymore, and the clipping of invisible surfaces is not as efficient, sending twice as many surfaces to the renderer. However, their total number after performing all the checks is not that great. Still, whatever one may say, adding this check to Xash3D raised the average FPS almost one and a half times in comparison to the original Half-Life - which answers the question of whether it is beneficial. But we digress.
So after chaining and drawing the visible surfaces, we call R_RecursiveWorldNode again, but now for the second of the two root branches of the BSP tree - just in case, because visible surfaces may well be there too. When the recursion ends, the result will be either a whole rendered world, or at least chains of visible surfaces, which is what actually gets sent for rendering with OpenGL or Direct3D - well, unless we drew our world right in the R_RecursiveWorldNode function, of course. This method, with minor upgrades, is successfully used in all three QUAKEs.
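Here is a compilable miniature of the traversal just described. The types and helper names are simplified stand-ins rather than the actual QUAKE declarations, so treat it as a sketch of the idea only:

typedef struct node_s {
    int   visframe;              /* stamped by the PVS marking pass */
    int   contents;              /* < 0 means this node is a leaf */
    float plane[4];              /* splitting plane: normal xyz + distance */
    struct node_s *children[2];
} node_t;

extern int   r_visframecount;
extern float r_origin[3];        /* viewer position */

int  CullBoxAgainstFrustum(const node_t *n);   /* assumed helper */
void ChainLeafSurfaces(node_t *leaf);          /* assumed helper */
void ChainNodeSurfaces(node_t *n, int side);   /* assumed helper: texturechain */

void RecursiveWorldNode(node_t *node)
{
    if (!node || node->visframe != r_visframecount)
        return;                                /* not in the PVS this frame */
    if (CullBoxAgainstFrustum(node))
        return;                                /* outside the view pyramid */

    if (node->contents < 0) {                  /* a leaf: remember its surfaces */
        ChainLeafSurfaces(node);
        return;
    }

    /* which side of the splitting plane is the viewer on? */
    float d = r_origin[0] * node->plane[0]
            + r_origin[1] * node->plane[1]
            + r_origin[2] * node->plane[2] - node->plane[3];
    int side = (d < 0);

    RecursiveWorldNode(node->children[side]);   /* near side first */
    ChainNodeSurfaces(node, side);              /* surfaces lying on the node */
    RecursiveWorldNode(node->children[!side]);  /* then the far side */
}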

A naked man is in a wardrobe because he's waiting for a tram

One of the upgrades is the utilization of so-called areaportals. This is another optimization method, coming straight out of QUAKE2. The point of areaportals is that the game logic can turn the visibility of entire sectors on and off at its discretion. Technically, this is achieved as follows: the world is divided into zones similar to the usual partitioning along the BSP tree; however, there can't be more than 256 of them (later I will explain why) and they are not connected in any way.
Regular visibility is determined just like in QUAKE; however, by placing a special "func_areaportal" entity we can force the compiler to split an area in two. This mechanism operates on approximately the same principle as the algorithm that searches for holes in the map, so you won't deceive the compiler by putting func_areaportal in a bare field - the compiler will simply ignore it. Although if you make the areaportal the size of the cross-section of this field (reaching the skybox in all directions), the zones will be divided in spite of everything. We can observe this technique in Half-Life 2, where an attempt to return to old places (with cheats, for example) shows us disconnected areaportals and a brief transition through the void from one zone to another. Actually, this mechanism is what helped Half-Life 2 simulate large spaces successfully while still using the BSP level structure (I have already said that BSP - its visibility check algorithm, to be precise - is not very suitable for open spaces).
So an installed areaportal forcibly breaks one zone into two, and the rest of the zoning is at the discretion of the compiler, which at the same time makes sure not to exceed the limit of 256 zones, so their sizes can be completely different. It depends, I repeat, on the overall size of the map. Our areaportal is connected to some door dividing these two zones. When the door is closed, it turns the areaportal off and the zones are separated from each other. Therefore, if the player is not in the cut-off zone, there is no point in rendering it. In QUAKE, we'd have to do a bunch of checks, and quite possibly we could cut off only a fraction of the polygons (after all, the door itself is not an obstacle for the visibility check, much less for the frustum). Compare that to this case in point: one command is issued - and the whole room is excluded from visibility. "Not bad," you'd say, "but how would the renderer find out? After all, we performed all our operations on the server and the client does not know anything about it." And here we go back to the question of why there can't be more than 256 zones.
The point is, the information about the visibility of all zones is likewise packed into bit flags (like PVS) and transmitted to the client in a network message. Dividing 256 bits by 8 makes 32 bytes, which generally isn't that much. In addition, the tail of this information can easily be cut off if it contains only zeroes, though the price of such an optimization is an extra byte that has to be transmitted over the network to indicate the actual size of the zone-visibility message. But in general, this approach is justified.
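In code, the zone-visibility vector might look like the sketch below (QUAKE2 calls this array "areabits"; the helper for trimming trailing zero bytes is illustrative):

#define MAX_AREAS 256

unsigned char areabits[MAX_AREAS / 8];   /* 32 bytes, sent to the client */

void SetAreaVisible(int area)
{
    areabits[area >> 3] |= 1 << (area & 7);
}

int AreaVisible(int area)
{
    return areabits[area >> 3] & (1 << (area & 7));
}

/* trailing zero bytes can be dropped before transmission,
   at the cost of one extra byte carrying the actual length */
int UsedAreaBytes(void)
{
    int n = sizeof(areabits);
    while (n > 0 && areabits[n - 1] == 0)
        n--;
    return n;
}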

Light_environment traces enter from the back

Source Engine turned out to have a terrible bug which makes the whole areaportal mechanism nearly meaningless. Numerous problems arise because of it: water breaks down into segments that pop in, and so on - you should be familiar with all this by now. The areaportal cuts the geometry unpredictably, like an ordinary secant plane, but its whole point is to be predictable! Areaportal brushes in Source Engine have absolutely no priority in splitting the map, whereas it should work like this: first, the tree is cut the regular way, and when no suitable planes are left, the final secant plane of the areaportal is used. This is the only way to cut the sectors correctly.

Modern problems

The second optimization method, as I said, is an increased size of the final leaf, akin to QUAKE3. It is believed that a video card will draw a certain number of polygons faster than the CPU can check whether they are visible. This comes from the very concept of a visibility check: if the check takes longer than simply drawing, then to hell with the check. The controversy of this approach stems from the wide range of video cards in the hands of end users, and it is strongly driven by the surging fashion for laptops and netbooks, in which the video card is a very conditional and very weak concept (don't even consider its claimed Shader Model 3 support). So for desktop gaming machines it is more efficient to draw more at a time, but for the weak video cards of laptops traditional culling remains more reliable - even culling as simple as what I described earlier.

Decompression sickness simulator

Although I should also mention the principles of frustum culling, perhaps they are incomprehensible to some. Cutoff by the frustum pyramid is actually pure mathematics, without any compiler calculations. From the current direction of the player's gaze, a clipping pyramid is built (the tip of the pyramid - in case someone is unsure - sits at the player's point of view and its base is oriented in the direction of the player's view). The angle between the walls of the pyramid can be acute or obtuse - as you probably guessed already, it depends on the player's FOV. In addition, the player can forcefully pull the far wall of the pyramid closer (yes, this is the notorious "MaxRange" parameter in the "worldspawn" settings of the map editor). Of course, OpenGL also builds a similar pyramid for its internal needs when it takes information from the projection matrix, but we're talking about the local pyramid now. The finished pyramid consists of 4-6 planes (QUAKE uses only 4 planes and trusts OpenGL to independently cut far and near polygons, but if you write your own renderer and intend to support mirrors and portals, you will definitely need all six planes). The frustum test itself is an elementary check for the presence of an AA-box (AABB, Axis-Aligned Bounding Box) in the frustum pyramid - or, speaking more correctly, a check for their intersection. Let me remind you that each branch has its own dimensions (a fragment of the secant plane bounded by the neighboring perpendicular secant planes) which are checked for intersection. But unfortunately the frustum test has one fundamental drawback - it cannot cut what is directly in the player's view. We can adjust the cutoff distance, we can even make that "ear feint" like they do in QFusion, where the final zFar value is calculated each frame before rendering and then taken into account in entity clipping, but whatever they say, that value itself is obtained from PVS information. Therefore, neither of the two methods can replace the other; they complement each other. This should be remembered.
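The per-plane AABB test boils down to a few lines. A minimal sketch, assuming the plane set has been filled in elsewhere from the view origin, angles and FOV (only the four side planes are used, as in QUAKE):

typedef struct {
    float normal[3];
    float dist;
} plane_t;

plane_t frustum[4];   /* assumed to be set up from the view parameters */

/* reject a box if it lies entirely behind any single frustum plane */
int CullBox(const float mins[3], const float maxs[3])
{
    for (int i = 0; i < 4; i++) {
        const plane_t *p = &frustum[i];
        float v[3];

        /* pick the box corner farthest along the plane normal */
        for (int j = 0; j < 3; j++)
            v[j] = (p->normal[j] >= 0.0f) ? maxs[j] : mins[j];

        if (v[0] * p->normal[0] + v[1] * p->normal[1] + v[2] * p->normal[2]
                < p->dist)
            return 1;   /* completely outside this plane: cull */
    }
    return 0;           /* inside or intersecting: keep */
}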

I gotta lay off the pills I'm taking

It seems that we have figured out the rendering of the world, so now we are moving smoothly on to cutting off moving objects... which are all the visible objects in the world! Even the ones that, at first glance, stand still and aren't planning to move anywhere. 'Cause the player moves! From one point he still sees a certain static object, and from another point, of course, he no longer does. This detail should also be considered.
Actually, at the beginning of this article I already spoke in detail about the algorithm of the objects' visibility check: first we find the visible leaf for the player, then we find the visible leaf for the entity, and then we check via visdata whether they see each other. Let me also clarify (in case someone doesn't understand) that each moving entity is assigned the number of its current leaf, i.e. the leaf for the entity's own current position; the leaves themselves are, of course, static and always in the same place.
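Put together, the check reads roughly like this (Mod_PointInLeaf echoes the real QUAKE function, but the signatures, the leaf type and the other helpers here are simplified stand-ins):

typedef struct leaf_s leaf_t;

leaf_t *Mod_PointInLeaf(const float point[3]);     /* find the leaf for a point */
const unsigned char *LeafPVS(const leaf_t *leaf);  /* decompressed visdata */
int LeafNumber(const leaf_t *leaf);                /* index of a leaf in the PVS */

int EntityVisible(const float player_origin[3], const float entity_origin[3])
{
    const leaf_t *player_leaf = Mod_PointInLeaf(player_origin);
    const leaf_t *entity_leaf = Mod_PointInLeaf(entity_origin);
    const unsigned char *pvs  = LeafPVS(player_leaf);
    int n = LeafNumber(entity_leaf);

    return (pvs[n >> 3] & (1 << (n & 7))) != 0;
}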

Ostrich is such an OP problem solver

So the method described above has two potential problems:
The first problem is that even if A sees B then, oddly enough, B is far from always seeing A. In other words, entity A can see entity B, but this does not mean that entity B sees entity A - and no, it's not about one of them "looking" away. So why does this happen? Most often for two reasons:
The first reason is that one of the entities' ORIGIN sits tight inside a wall, and the Mod_PointInLeaf function for it points to the outer "zero" leaf from which EVERYTHING is visible (hasn't any of you ever flown around outside the map?). Meanwhile, no leaf inside the map can see the outer leaf - together, these two features explain the interesting fact of the entire world geometry becoming visible, and, on the contrary, of all objects disappearing, when you fly outside the map. In regular play, similar problems can occur for objects attached to a wall or recessed into it. For example, sometimes the sounds of a pressed button or an opening door disappear because its current position went beyond the world borders. This phenomenon is fought by interchanging objects A and B or by obtaining alternative points for the position of an object, but all the same, none of it is very reliable.

But lawyer said that you don't exist

In addition, as I said, there is another problem. It comes from the fact that not every entity fits into a single leaf. Only the player is small enough to always be found in one leaf (well, in the most extreme case, in two leaves on the border of water and air - a phenomenon fought with various hacks, by the way), but some giant tentacle or, on the contrary, an elevator made as a door entity can easily occupy 30-40 leaves at a time. An attempt to check a single leaf (for example, the one where the center of the model is) will inevitably lead to a deplorable result: as soon as the center of the object leaves the player's visibility, the entire object will disappear completely. The most common case is the notorious func_door used as an elevator; there is one in QUAKE on E1M1. Observe: it travels halfway, and then its ORIGIN is outside the map, so it should disappear from the player's field of view. However, it does not go anywhere, right? Let us see in greater detail how this is done.
The simplest idea that comes to mind: since the object occupies several leaves, we should save them all somewhere in the object's structure in the code and check them one by one; if at least one of these leaves is visible, then the whole object is visible (its very tip, for example). This is exactly what was implemented in QUAKE: a static array of 16 leaves and a simple recursive function, SV_FindTouchedLeafs, that looks for all the leaves in the range given by the "pev->absmin" and "pev->absmax" variables (pev being a Pointer to the EntVars_t table). absmin and absmax are recalculated each time SV_LinkEdict (or its more specific case, UTIL_SetOrigin) is called. Hence the quite logical conclusion that a simple change of ORIGIN without recalculating the visible leaves will sooner or later take the object out of visibility, even if, surprisingly enough, it's right in front of the player and the player should technically still be able to see it. Inb4 "why does one have to call UTIL_SetOrigin, and wouldn't it be easier to just assign a new value to the pev->origin vector without calling this function?" It wouldn't.
With this method we can solve both former problems perfectly: we can fight the loss of visibility when the object's ORIGIN goes beyond the world borders, and we can level out the difference between visibility for A->B and visibility for B->A.
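A sketch of that leaf-gathering walk, reusing the node_t from the traversal sketch above (the names follow the QUAKE source, but the structures are trimmed to the essentials and the leaf-index helper is assumed):

#define MAX_ENT_LEAFS 16   /* QUAKE's limit; Half-Life raised it to 48 */

typedef struct {
    int   num_leafs;
    short leafnums[MAX_ENT_LEAFS];
    float absmin[3], absmax[3];   /* world-space box, set by SV_LinkEdict */
} edict_t;

/* returns 1 (front), 2 (back) or 3 (spans both), as in the QUAKE source */
int BoxOnPlaneSide(const float mins[3], const float maxs[3],
                   const float plane[4]);
int LeafIndex(const node_t *leaf);   /* assumed helper */

void FindTouchedLeafs(edict_t *ent, node_t *node)
{
    if (node->contents < 0) {                 /* it's a leaf */
        if (ent->num_leafs < MAX_ENT_LEAFS)   /* array full? silently stop */
            ent->leafnums[ent->num_leafs++] = (short)LeafIndex(node);
        return;
    }

    /* recurse into every side of the plane the entity's box touches */
    int sides = BoxOnPlaneSide(ent->absmin, ent->absmax, node->plane);
    if (sides & 1) FindTouchedLeafs(ent, node->children[0]);
    if (sides & 2) FindTouchedLeafs(ent, node->children[1]);
}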

A secret life of monster_tripmine

Actually, we've yet to encounter another problem, though it does not occur immediately. Remember, we've got an array of 16 leaves. But what if that's not enough? Thank God there are no beams in QUAKE, and no very long elevators made as func_door either - for this exact reason. Because when the array is filled to capacity, the SV_FindTouchedLeafs function simply stops, and we can only hope that there won't be too many cases of an object disappearing right before our eyes. But in the original QUAKE, such cases may well happen. In Half-Life the situation is even worse: as you may remember, there are beams that can stretch across half the map - tripmine beams, for example - and then a situation may occur where we see just the very tip of the beam. For most of these beams, 16 leaves are clearly not enough. Valve tried to remedy the situation by increasing the array to 48 leaves. That helped - on the early maps. If you remember, at the very beginning of the game, once the player has got off the tram, he enters that epic elevator that takes him down. The elevator is made as a door entity and it occupies exactly 48 leaves. Apparently, the final size of the array was chosen to fit it. Then the programmers realized that this wasn't really a solution - no matter how much you expand the array, it can still be lacking for something - so they bolted on an alternative method of visibility checking: the head-branch (headnode) check. In short, this is still the same SV_FindTouchedLeafs, but now called directly from the place of the visibility check, with the visdata passed into it. It is not used very often, because it is slower than checking pre-accumulated leaves; it is intended precisely for non-standard cases like this one.
Well, since I hope the general picture of the clipping mechanism is already beginning to take shape in your mind, I will finish the article in just a few words.
On the server, all objects that have passed the visibility check are added to the network message containing information about visible objects. Thus, on the client, the list of visible entities has already been culled by PVS, and a simple frustum check is enough; we do not have to do the PVS work again. You ask, "Why did we have to cut off invisible objects on the server when we could do it later, on the client?" I reply: yes, we could, but the objects cut off on the server never got into the network message, and that saved us some traffic. Since the player cannot see them anyway, what would be the point of transferring them to the client just to check them for visibility there? This is a kind of double optimization :)
© Uncle Mike 2012
submitted by crystallize1 to hammer

Differences between LISP 1.5 and Common Lisp, Part 1

In this post we'll be looking at some of the things that make LISP 1.5 and Common Lisp different. There isn't too much surviving LISP 1.5 code, but some of the code that is still around is interesting and worthy of study.
Sources are linked below where they are relevant.
The differences between LISP 1.5 and Common Lisp can be classified into the following groups:
  1. Superficial differences—matters of syntax
  2. Conventional differences—matters of code style and form
  3. Fundamental differences—matters of semantics
  4. Library differences—matters of available functions
This post will go through the first three of these groups in that order. A future post will discuss library differences, except for some functions dealing with character-based input and output, since they are a little world unto their own.
[Originally the library differences were part of this post, but it exceeded the length limit on posts (40000 characters)].

Superficial differences.

LISP 1.5 was used initially on computers that had very limited character sets. The machine on which it ran at MIT, the IBM 7090, used a six-bit, binary-coded decimal encoding for characters, which could theoretically represent up to sixty-four characters. In practice, only forty-six were widely used. The repertoire of this character set consisted of the twenty-six uppercase letters, the nine digits, the blank character ' ', and the ten special characters '-', '/', '=', '.', '$', ',', '(', ')', '*', and '+'. You might note the absence of the apostrophe/single quote—there was no shorthand for the quote operator in LISP 1.5 because no sensible character was available.
When the LISP 1.5 system read input from cards, it treated the end of a card not like a blank character (as is done in C, TeX, etc.), but as nothing. Therefore the first character of a symbol's name could be the last character of a card, the remaining characters appearing at the beginning of the next card. Lisp's syntax allowed for the omission of almost all whitespace besides that which was used as delimiters to separate tokens.
List syntax. Lists were contained within parentheses, as is the case in Common Lisp. From the beginning Lisp had the consing dot, which was written as a period in LISP 1.5; the interaction between the period when used as the consing dot and the period when used as the decimal point will be described shortly.
In LISP 1.5, the comma was equivalent to a blank character; both could be used to delimit items within a list. The LISP I Programmer's Manual, p. 24, tells us that
The commas in writing S-expressions may be omitted. This is an accident.
Number syntax. Numbers took one of three forms: fixed-point integers, floating-point numbers, and octal numbers. (Of course octal numbers were just an alternative notation for the fixed-point integers.)
Fixed-point integers were written simply as the decimal representation of the integers, with an optional sign. It isn't explicitly mentioned whether a plus sign is allowed in this case or if only a minus sign is, but floating-point syntax does allow an initial plus sign, so it makes sense that the fixed-point number syntax would as well.
Floating-point numbers had the syntax described by the following context-free grammar, where a term in square brackets indicates that the term is optional:
float: [sign] integer '.' [integer] exponent
       [sign] integer '.' integer [exponent]
exponent: 'E' [sign] digit [digit]
integer: digit
         integer digit
digit: one of
       '0' '1' '2' '3' '4' '5' '6' '7' '8' '9'
sign: one of
      '+' '-'
This grammar generates things like 100.3 and 1.E5 but not things like .01 or 14E2 or 100.. The manual seems to imply that if you wrote, say, (100. 200), the period would be treated as a consing dot [the result being (cons 100 200)].
Floating-point numbers are limited in absolute value to the interval (2^-128, 2^128), and eight digits are significant.
Octal numbers are defined by the following grammar:
octal: [sign] octal-digits 'Q' [integer]
octal-digits: octal-digit [octal-digit] [octal-digit] [octal-digit]
              [octal-digit] [octal-digit] [octal-digit] [octal-digit]
              [octal-digit] [octal-digit] [octal-digit] [octal-digit]
octal-digit: one of
             '0' '1' '2' '3' '4' '5' '6' '7'
The optional integer following 'Q' is a scale factor: a decimal integer representing an exponent with a base of 8. Positive octal numbers behave as one would expect: the value is shifted to the left by 3×s bits, where s is the scale factor. Octal was useful on the IBM 7090, since it used thirty-six-bit words; twelve octal digits (the maximum allowed in an octal number in LISP 1.5) thus represent a single word in a convenient way that is more compact than binary (but still easily convertible to and from binary). If the number has a negative sign, then the thirty-sixth bit is logically ORed with 1.
The syntax of Common Lisp's numbers is a superset of that of LISP 1.5. The only major difference is in the notation of octal numbers; Common Lisp uses the sharpsign reader macro for that purpose. Because of the somewhat odd semantics of the minus sign in octal numbers in LISP 1.5, it is not necessarily trivial to convert a LISP 1.5 octal number into a Common Lisp expression resulting in the same value.
Symbol syntax. Symbol names can be up to thirty characters in length. While the actual name of a symbol was kept on its property list under the pname indicator and could be any sequence of thirty characters, the syntax accepted by the read program for symbols was limited in a few ways. First, a name must not begin with a digit or with either of the characters '+' or '-', and its first two characters cannot be '$$'. Otherwise, all the alphanumeric characters are allowed, along with the special characters '+', '-', '=', '*', '/', and '$'. The fact that a symbol can't begin with a sign character or a digit has to do with the number syntax; the fact that a symbol can't begin with '$$' has to do with the mechanism by which the LISP 1.5 reader allowed you to write characters that are usually not allowed in symbols, which is described next.
Two dollar signs initiated the reading of what we today might call an "escape sequence". An escape sequence had the form "$$xSx", where x was any character and S was a sequence of up to thirty characters not including x. For example, $$x()x would get the symbol whose name is '()' and would print as '()'. Thus it is similar in purpose to Common Lisp's | syntax. There is a significant difference: It could not be embedded within a symbol, unlike Common Lisp's |. In this respect it is closer to Maclisp's | reader macro (which created a single token) than it is to Common Lisp's multiple escape character. In LISP 1.5, "A$$X()X$" would be read as (1) the symbol A$$X, (2) the empty list, (3) the symbol X.
The following code sets up a $ reader macro so that symbols using the $$ notation will be read in properly, while leaving things like $eof$ alone.
(defun dollar-sign-reader (stream character)
  (declare (ignore character))
  (let ((next (read-char stream t nil t)))
    (cond ((char= next #\$)
           (let ((terminator (read-char stream t nil t)))
             (values (intern (with-output-to-string (name)
                               (loop for c := (read-char stream t nil t)
                                     until (char= c terminator)
                                     do (write-char c name)))))))
          (t
           (unread-char next stream)
           (with-standard-io-syntax
             (read (make-concatenated-stream
                    (make-string-input-stream "$")
                    stream)
                   t nil t))))))

(set-macro-character #\$ #'dollar-sign-reader t)

Conventional differences.

LISP 1.5 is an old programming language. Generally, compared to its contemporaries (such as FORTRANs I–IV), it holds up well to modern standards, but sometimes its age does show. And there were some aspects of LISP 1.5 that might be surprising to programmers familiar only with Common Lisp or a Scheme.
M-expressions. John McCarthy's original concept of Lisp was a language with a syntax like this (from the LISP 1.5 Programmer's Manual, p. 11):
equal[x;y]=[atom[x]→[atom[y]→eq[x;y]; T→F];
            equal[car[x];car[y]]→equal[cdr[x];cdr[y]];
            T→F]
There are several things to note. First is the entirely different phrase structure: it's an infix language, looking much closer to mathematics than the Lisp we know and love. Square brackets are used instead of parentheses, and semicolons are used instead of commas (or blanks). When square brackets do not enclose function arguments (or parameters, when to the left of the equals sign), they set up a conditional expression; the arrows separate predicate expressions from consequent expressions.
If that was Lisp, then where do s-expressions come in? Answer: quoting. In the m-expression notation, uppercase strings of characters represent quoted symbols, and parenthesized lists represent quoted lists. Here is an example from page 13 of the manual:
λ[[x;y];cons[car[x];y]][(A B);(C D)] 
As an s-expressions, this would be
((lambda (x y) (cons (car x) y)) '(A B) '(C D)) 
The majority of the code in the manual is presented in m-expression form.
So why did s-expressions stick? There are a number of reasons. The earliest Lisp interpreter was a translation of the program for eval in McCarthy's paper introducing Lisp, which interpreted quoted data; therefore it read code in the form of s-expressions. S-expressions are much easier for a computer to parse than m-expressions, and also more consistent. (Also, the character set mentioned above includes neither square brackets nor a semicolon, let alone a lambda character.) But in publications m-expressions were seen frequently; perhaps the syntax was seen as a kind of "Lisp pseudocode".
Comments. LISP 1.5 had no built-in commenting mechanism. It's easy enough to define a comment operator in the language, but it seems nobody felt a need for one.
Interestingly, FORTRAN I had comments. Assembly languages of the time sort of had comments, in that they had a portion of each line/card that was ignored in which you could put any text. FORTRAN was ahead of its time.
(Historical note: The semicolon comment used in Common Lisp comes from Maclisp. Maclisp likely got it from PDP-10 assembly language, which let a semicolon and/or a line break terminate a statement; thus anything following a semicolon is ignored. The convention of octal numbers by default, decimal numbers being indicated by a trailing decimal point, of Maclisp too comes from the assembly language.)
Code formatting. The code in the manual that isn't written using m-expression syntax is generally lacking in meaningful indentation and spacing. Here is an example (p. 49):
(TH1 (LAMBDA (A1 A2 A C) (COND ((NULL A) (TH2 A1 A2 NIL NIL C)) (T (OR (MEMBER (CAR A) C) (COND ((ATOM (CAR A)) (TH1 (COND ((MEMBER (CAR A) A1) A1) (T (CONS (CAR A) A1))) A2 (CDR A) C)) (T (TH1 A1 (COND ((MEMBER (CAR A) A2) A2) (T (CONS (CAR A) A2))) (CDR A) C)))))))) 
Nowadays we might indent it like so:
(TH1 (LAMBDA (A1 A2 A C)
       (COND ((NULL A) (TH2 A1 A2 NIL NIL C))
             (T (OR (MEMBER (CAR A) C)
                    (COND ((ATOM (CAR A))
                           (TH1 (COND ((MEMBER (CAR A) A1) A1)
                                      (T (CONS (CAR A) A1)))
                                A2
                                (CDR A)
                                C))
                          (T (TH1 A1
                                  (COND ((MEMBER (CAR A) A2) A2)
                                        (T (CONS (CAR A) A2)))
                                  (CDR A)
                                  C))))))))
Part of the lack of formatting stems probably from the primarily punched-card-based programming world of the time; you would see the indented structure only by printing a listing of your code, so there is no need to format the punched cards carefully. LISP 1.5 allowed a very free format, especially when compared to FORTRAN; the consequence is that early LISP 1.5 programs are very difficult to read because of the lack of spacing, while old FORTRAN programs are limited at least to one statement per line.
The close relationship of Lisp and pretty-printing originates in programs developed to produce nicely formatted listings of Lisp code.
Lisp code from the mid-sixties used some peculiar formatting conventions that seem odd today. Here is a quote from Steele and Gabriel's Evolution of Lisp:
This intermediate example is derived from a 1966 coding style:
DEFINE((
(MEMBER (LAMBDA (A X) (COND ((NULL X) F)
                            ((EQ A (CAR X) ) T)
                            (T (MEMBER A (CDR X))) )))
))
The design of this style appears to take the name of the function, the arguments, and the very beginning of the COND as an idiom, and hence they are on the same line together. The branches of the COND clause line up, which shows the structure of the cases considered.
This kind of indentation is somewhat reminiscent of the formatting of Algol programs in publications.
Programming style. Old LISP 1.5 programs can seem somewhat primitive. There is heavy use of the prog feature, which is related partially to the programming style that was common at the time and partially to the lack of control structures in LISP 1.5. You could express iteration only by using recursion or by using prog+go; there wasn't a built-in looping facility. There is a library function called for that is something like the early form of Maclisp's do (the later form would be inherited in Common Lisp), but no surviving LISP 1.5 code uses it. [I'm thinking of making another post about converting programs using prog to the more structured forms that Common Lisp supports, if doing so would make the logic of the program clearer. Naturally there is a lot of literature on so called "goto elimination" and doing it automatically, so it would not present any new knowledge, but it would have lots of Lisp examples.]
LISP 1.5 did not have a let construct. You would use either a prog and setq or a lambda:
(let ((x y)) ...) 
is equivalent to
((lambda (x) ...) y) 
Something that stands out immediately when reading LISP 1.5 code is the heavy, heavy use of combinations of car and cdr. This might help (though car and cdr should be left alone when they are used with dotted pairs):
(car x) = (first x)
(cdr x) = (rest x)
(caar x) = (first (first x))
(cadr x) = (second x)
(cdar x) = (rest (first x))
(cddr x) = (rest (rest x))
(caaar x) = (first (first (first x)))
(caadr x) = (first (second x))
(cadar x) = (second (first x))
(caddr x) = (third x)
(cdaar x) = (rest (first (first x)))
(cdadr x) = (rest (second x))
(cddar x) = (rest (rest (first x)))
(cdddr x) = (rest (rest (rest x)))
Here are some higher compositions, even though LISP 1.5 doesn't have them.
(caaaar x) = (first (first (first (first x))))
(caaadr x) = (first (first (second x)))
(caadar x) = (first (second (first x)))
(caaddr x) = (first (third x))
(cadaar x) = (second (first (first x)))
(cadadr x) = (second (second x))
(caddar x) = (third (first x))
(cadddr x) = (fourth x)
(cdaaar x) = (rest (first (first (first x))))
(cdaadr x) = (rest (first (second x)))
(cdadar x) = (rest (second (first x)))
(cdaddr x) = (rest (third x))
(cddaar x) = (rest (rest (first (first x))))
(cddadr x) = (rest (rest (second x)))
(cdddar x) = (rest (rest (rest (first x))))
(cddddr x) = (rest (rest (rest (rest x))))
Things like defstruct and Flavors were many years away. For a long time, Lisp dialects had lists as the only kind of structured data, and programmers rarely defined functions with meaningful names to access components of data structures that are represented as lists. Part of understanding old Lisp code is figuring out how data structures are built up and what their components signify.
In LISP 1.5, it's fairly common to see nil used where today we'd use (). For example:
(LAMBDA NIL ...) 
instead of
(LAMBDA () ...) 
or

(PROG NIL ...) 
instead of
(PROG () ...) 
Actually this practice was used in other Lisp dialects as well, although it isn't really seen in newer code.
Identifiers. If you examine the list of all the symbols described in the LISP 1.5 Programmer's Manual, you will notice that none of them differ only in the characters after the sixth character. In other words, it is as if symbol names have only six significant characters, so that abcdef1 and abcdef2 would be considered equal. But it doesn't seem like that was actually the case, since there is no mention of such a limitation in the manual. Another thing of note is that many symbols are six characters or fewer in length.
(A sequence of six characters is nice to store on the hardware on which LISP 1.5 was running. The processor used thirty-six-bit words, and characters were six-bit; therefore six characters fit in a single word. It is conceivable that it might be more efficient to search for names that take only a single word to store than for names that take more than one word to store, but I don't know enough about the computer or implementation of LISP 1.5 to know if that's true.)
Even though the limit on names was thirty characters (the longest symbol names in standard Common Lisp are update-instance-for-different-class and update-instance-for-redefined-class, both thirty-five characters in length), only a few of the LISP 1.5 names are not abbreviated. Things like terpri ("terminate print") and even car and cdr ("contents of address part of register" and "contents of decrement part of register"), which have stuck around until today, are pretty inscrutable if you don't know what they mean.
Thankfully the modern style is to limit abbreviations. Comparing the names that were introduced in Common Lisp versus those that have survived from LISP 1.5 (see the "Library" section below) shows a clear preference for good naming in Common Lisp, even at the risk of lengthy names. The multiple-value-bind operator could easily have been named mv-bind, but it wasn't.

Fundamental differences.

Truth values. Common Lisp has a single value considered to be false, which happens to be the same as the empty list. It can be represented either by the symbol nil or by (); either of these may be quoted with no difference in meaning. Anything else, when considered as a boolean, is true; however, there is a self-evaluating symbol, t, that traditionally is used as the truth value whenever there is no other more appropriate one to use.
In LISP 1.5, the situation was similar: Just like Common Lisp, nil or the empty list are false and everything else is true. But the symbol nil was used by programmers only as the empty list; another symbol, f, was used as the boolean false. It turns out that f is actually a constant whose value is nil. LISP 1.5 had a truth symbol t, like Common Lisp, but it wasn't self-evaluating. Instead, it was a constant whose permanent value was *t*, which was self-evaluating. The following code will set things up so that the LISP 1.5 constants work properly:
(defconstant *t* t) ; (eq *t* t) is true
(defconstant f nil)
Recall the practice in older Lisp code that was mentioned above of using nil in forms like (lambda nil ...) and (prog nil ...), where today we would probably use (). Perhaps this usage is related to the fact that nil represented an empty list more than it did a false value; or perhaps the fact that it seems so odd to us now is related to the fact that there is even less of a distinction between nil the empty list and nil the false value in Common Lisp (there is no separate f constant).
Function storage. In Common Lisp, when you define a function with defun, that definition gets stored somehow in the global environment. LISP 1.5 stores functions in a much simpler way: A function definition goes on the property list of the symbol naming it. The indicator under which the definition is stored is either expr or fexpr or subr or fsubr. The expr/fexpr indicators were used when the function was interpreted (written in Lisp); the subr/fsubr indicators were used when the function was compiled (or written in machine code). Functions can be referred to based on the property under which their definitions are stored; for example, if a function named f has a definition written in Lisp, we might say that "f is an expr."
When a function is interpreted, its lambda expression is what is stored. When a function is compiled or machine coded, a pointer to its address in memory is what is stored.
The choice between expr and fexpr and between subr and fsubr is based on evaluation. Functions that are exprs and subrs are evaluated normally; for example, an expr is effectively replaced by its lambda expression. But when an fexpr or an fsubr is to be processed, the arguments are not evaluated. Instead they are put in a list. The fexpr or fsubr definition is then passed that list and the current environment. The reason for the latter is so that the arguments can be selectively evaluated using eval (which took a second argument containing the environment in which evaluation is to occur). Here is an example of what the definition of an fexpr might look like, LISP 1.5 style. This function takes any number of arguments and prints them all, returning nil.
(LAMBDA (A E)
  (PROG ()
   LOOP (PRINT (EVAL (CAR A) E))
        (COND ((NULL (CDR A)) (RETURN NIL)))
        (SETQ A (CDR A))
        (GO LOOP)))
The "f" in "fexpr" and "fsubr" seems to stand for "form", since fexpr and fsubr functions got passed a whole form.
The top level: evalquote. In Common Lisp, the interpreter is usually available interactively in the form of a "Read-Evaluate-Print-Loop", for which a common abbreviation is "REPL". Its structure is exactly as you would expect from that name: Repeatedly read a form, evaluate it (using eval), and print the results. Note that this model is the same as top level file processing, except that the results of only the last form are printed, when it's done.
In LISP 1.5, the top level is not eval, but evalquote. Here is how you could implement evalquote in Common Lisp:
(defun evalquote (operator arguments)
  (eval (cons operator arguments)))
LISP 1.5 programs commonly look like this (define takes a list of function definitions):
DEFINE ((
  (FUNCTION1 (LAMBDA () ...))
  (FUNCTION2 (LAMBDA () ...))
  ...
))
which evalquote would process as though it had been written
(DEFINE (
  (FUNCTION1 (LAMBDA () ...))
  (FUNCTION2 (LAMBDA () ...))
  ...
))
Evaluation, scope, extent. Before further discussion, here the evaluator for LISP 1.5 as presented in Appendix B, translated from m-expressions to approximate Common Lisp syntax. This code won't run as it is, but it should give you an idea of how the LISP 1.5 interpreter worked.
(defun evalquote (function arguments)
  (if (atom function)
      (if (or (get function 'fexpr)
              (get function 'fsubr))
          (eval (cons function arguments) nil))
      (apply function arguments nil)))

(defun apply (function arguments environment)
  (cond ((null function) nil)
        ((atom function)
         (let ((expr (get function 'expr))
               (subr (get function 'subr)))
           (cond (expr
                  (apply expr arguments environment))
                 (subr ; see below
                  )
                 (t
                  (apply (cdr (sassoc function environment
                                      (lambda () (error "A2"))))
                         arguments environment)))))
        ((eq (car function) 'label)
         (apply (caddr function)
                arguments
                (cons (cons (cadr function) (caddr function))
                      environment)))
        ((eq (car function) 'funarg)
         (apply (cadr function) arguments (caddr function)))
        ((eq (car function) 'lambda)
         (eval (caddr function)
               (nconc (pair (cadr function) arguments)
                      environment)))
        (t
         (apply (eval function environment) arguments environment))))

(defun eval (form environment)
  (cond ((null form) nil)
        ((numberp form) form)
        ((atom form)
         (let ((apval (get form 'apval)))
           (if apval
               (car apval)
               (cdr (sassoc form environment
                            (lambda () (error "A8")))))))
        ((eq (car form) 'quote)
         (cadr form))
        ((eq (car form) 'function)
         (list 'funarg (cadr form) environment))
        ((eq (car form) 'cond)
         (evcon (cdr form) environment))
        ((atom (car form))
         (let ((expr (get (car form) 'expr))
               (fexpr (get (car form) 'fexpr))
               (subr (get (car form) 'subr))
               (fsubr (get (car form) 'fsubr)))
           (cond (expr
                  (apply expr (evlis (cdr form) environment) environment))
                 (fexpr
                  (apply fexpr (list (cdr form) environment) environment))
                 (subr ; see below
                  )
                 (fsubr ; see below
                  )
                 (t
                  (eval (cons (cdr (sassoc (car form) environment
                                           (lambda () (error "A9"))))
                              (cdr form))
                        environment)))))
        (t
         (apply (car form)
                (evlis (cdr form) environment)
                environment))))

(defun evcon (cond environment)
  (cond ((null cond) (error "A3"))
        ((eval (caar cond) environment)
         (eval (cadar cond) environment))
        (t (evcon (cdr cond) environment))))

(defun evlis (list environment)
  (maplist (lambda (j) (eval (car j) environment))
           list))
(The definition of evalquote earlier was a simplification to avoid the special case of special operators in it. LISP 1.5's apply can't handle special operators (which is also true of Common Lisp's apply). Hopefully the little white lie can be forgiven.)
There are several things to note about these definitions. First, it should be reiterated that they will not run in Common Lisp, for many reasons. Second, in evcon an error has been corrected; the original says in the consequent of the second branch (effectively)
(eval (cadar environment) environment) 
Now to address the "see below" comments. In the manual it describes the actions of the interpreter as calling a function called spread, which takes the arguments given in a Lisp function call and puts them into the machine registers expected with LISP 1.5's calling convention, and then executes an unconditional branch instruction after updating the value of a variable called $ALIST to the environment passed to eval or to apply. In the case of fsubr, instead of calling spread, since the function will always get two arguments, it places them directly in the registers.
You will note that apply is considered to be a part of the evaluator, while in Common Lisp apply and eval are quite different. Here it takes an environment as its final argument, just like eval. This fact highlights an incredibly important difference between LISP 1.5 and Common Lisp: When a function is executed in LISP 1.5, it is run in the environment of the function calling it. In contrast, Common Lisp creates a new lexical environment whenever a function is called. To exemplify the differences, the following code, if Common Lisp were evaluated like LISP 1.5, would be valid:
(defun weird (a b)
  (other-weird 5))

(defun other-weird (n)
  (+ a b n))
In Common Lisp, the function weird creates a lexical environment with two variables (the parameters a and b), which have lexical scope and indefinite extent. Since the body of other-weird is not lexically within the form that binds a and b, trying to make reference to those variables is incorrect. You can thwart Common Lisp's lexical scoping by declaring those variables to have indefinite scope:
(defun weird (a b)
  (declare (special a b))
  (other-weird 5))

(defun other-weird (n)
  (declare (special a b))
  (+ a b n))
The special declaration tells the implementation that the variables a and b are to have indefinite scope and dynamic extent.
Let's talk now about the funarg branch of apply. The function/funarg device was introduced some time in the sixties in an attempt to solve the scoping problem exemplified by the following problematic definition (using Common Lisp syntax):
(defun testr (x p f u)
  (cond ((funcall p x) (funcall f x))
        ((atom x) (funcall u))
        (t (testr (cdr x)
                  p
                  f
                  (lambda () (testr (car x) p f u))))))
This function is taken from page 11 of John McCarthy's History of Lisp.
The only problematic part is the (car x) in the lambda in the final branch. The LISP 1.5 evaluator does little more than textual substitution when applying functions; therefore (car x) will refer to whatever x is bound to whenever the function (lambda expression) is applied, not when it is written.
How do you fix this issue? The solution employed in LISP 1.5 was to capture the environment present when the function expression is written, using the function operator. When the evaluator encounters a form that looks like (function f), it converts it into (funarg f environment), where environment is the current environment during that call to eval. Then when apply gets a funarg form, it applies the function in the environment stored in the funarg form instead of the environment passed to apply.
Something interesting arises as a consequence of how the evaluator works. Common Lisp, as is well known, has two separate name spaces for functions and for variables. If a Common Lisp implementation encounters
(lambda (f x) (f x)) 
the result is not a function applying one of its arguments to its other argument, but rather a function applying a function named f to its second argument. You have to use an operator like funcall or apply to use the functional value of the f parameter. If there is no function named f, then you will get an error. In contrast, LISP 1.5 will eventually find the parameter f and apply its functional value, if there isn't a function named f—but it will check for a function definition first. If a Lisp dialect that has a single name space is called a "Lisp-1", and one that has two name spaces is called a "Lisp-2", then I guess you could call LISP 1.5 a "Lisp-1.5"!
How can we deal with indefinite scope when trying to get LISP 1.5 programs to run in Common Lisp? Well, with any luck it won't matter; ideally the program does not have any references to variables that would be out of scope in Common Lisp. However, if there are such references, there is a fairly simple fix: Add special declarations everywhere. For example, say that we have the following (contrived) program, in which define has been translated into defun forms to make it simpler to deal with:
(defun f (x)
  (prog (m)
        (setq m a)
        (setq a 7)
        (return (+ m b x))))

(defun g (l)
  (h (* b a)))

(defun h (i)
  (/ l (f (setq b (setq a i)))))

(defun p ()
  (prog (a b i)
        (setq a 4)
        (setq b 6)
        (setq i 3)
        (return (g (f 10)))))
The result of calling p should be 10/63. To make it work, add special declarations wherever necessary:
(defun f (x)
  (declare (special a b))
  (prog (m)
        (setq m a)
        (setq a 7)
        (return (+ m b x))))

(defun g (l)
  (declare (special a b l))
  (h (* b a)))

(defun h (i)
  (declare (special a b l i))
  (/ l (f (setq b (setq a i)))))

(defun p ()
  (prog (a b i)
        (declare (special a b i))
        (setq a 4)
        (setq b 6)
        (setq i 3)
        (return (g (f 10)))))
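To see where the 10/63 comes from, here is my hand trace of the dynamic bindings, following the rules described above:

;; (p)      binds a=4, b=6, i=3 dynamically.
;; (f 10)   sets m=a=4, then a:=7; returns (+ 4 6 10) = 20.
;; (g 20)   binds l=20; a is now 7, so (* b a) = (* 6 7) = 42.
;; (h 42)   sets a:=42 and then b:=42, and calls (f 42).
;; (f 42)   sets m=a=42, then a:=7; returns (+ 42 42 42) = 126.
;; So h returns (/ 20 126) = 10/63, which g and p pass through.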
Be careful about the placement of the declarations. It is required that the one in p be inside the prog, since that is where the variables are bound; putting it at the beginning (i.e., before the prog) would do nothing because the prog would create new lexical bindings.
This method is not optimal, since it doesn't help much with understanding how the code works (although being able to see which variables are free and which are bound, just by reading the declarations, is very helpful). A better way is to factor out the variables shared among several functions (as long as you are sure they are used in only those functions) and bind them in a let, as sketched below. Doing that is more work than using global variables, but it leads to code that is easier to reason about. Of course, if a variable is used in a large number of functions, it might well be a better choice to create a global variable with defvar or defparameter.
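Here is a sketch of that refactoring (a hypothetical example of mine, not code from the original program); a and b become lexical variables shared by exactly the functions defined inside the let:

(let ((a nil) (b nil))
  (defun set-shared (new-a new-b)
    (setq a new-a
          b new-b))
  (defun use-shared (x)
    (+ a b x)))

;; (set-shared 4 6) establishes the shared state; (use-shared 10) => 20.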
Not all LISP 1.5 code is as bad as that example!
Join us next time as we look at the LISP 1.5 library. In the future, I think I'll make some posts talking about getting specific programs running. If you see any errors, please let me know.
submitted by kushcomabemybedtime to lisp [link] [comments]


Tales of Grayhall - A Scarlet Heroes Campaign (Adventure #1, Part 4)

 
Character sheet
Adventure #1, Part 3
Adventure #1, Part 2
Adventure #1, Part 1
 
In the previous session, Nikova brawled with a group of drunkards at a brothel, learned his suspect was in some presumably sketchy club and recruited the Chieftain’s eldest son along with his personal group of guards to help him crash the club meeting. I feel this will naturally beget a conflict scene, so that’s what we’ll start with this session.
 
 
{Conflict scene: 1d10 = 1 = Waylay a minion of the foe. Face a Fight instead of a check. Fight Difficulty: 1d8 + (1/2 T, rounded down) = 3 + 1 = 4, i.e., 1d4+T Rabble + 1 Veteran = (2 + 1) Rabble + 1 Veteran = 3 Rabble and 1 Veteran. Potential foes: 1d8 = 3, 1d10 = 1 = Aged Veteran. That worked out well. Now the Rabble: 1d8 = 4, 1d10 = 10 = Watchful Neighbor. Hmmm. I’ll say the suspect will opt into the scrap, and she will be a Rabble since she doesn’t have much fighting experience. I’ll consult oracles regarding the club’s intent as it becomes pertinent.}
With covert haste we approach the brothel. I hope we aren’t too late and that the group remains in session. Fyodor’s four guards form a semicircle around him, me, and the rear door of the building with practiced precision. Fyodor and I listen closely. Voices can be heard within. I recognize the gruff voice of the miner and signal my ally accordingly.
He knocks. “Chieftain’s Guard! May we enter?”
Rustling can be heard from within before a response is delivered. “Aye!” We enter, closing the door behind us to hide the semicircle of guards.
The miner is immediately on edge. Recognition in her eyes indicates that I am the source of her unease. She’s not the head of this quadrumvirate. An older Dwarf from the Soldier Clan is the obvious leader. He glances over at the miner, showing his senses are still as keen as ever, and sees how she has set her teeth in a slight grimace. {Is this enough cause for the Veteran to initiate combat? Binary Oracle (likelihood unknown): 1d20 = 2 = No.} They share a silent look and she relents.
“Copperhew. Spearstander. Honor.” He greets us with a customary bow and we do so likewise. His face does not stray from the solemnity carved thereupon, yet his voice escalates ever slightly to a more positive tone. “How can we serve our Chieftain?”
“As you know, our Clan meetings are this day so we have bolstered security. Report of a small group gathering in the area is a reason to be wary, you understand. We are checking the area to ensure safety of the village.”
{NPC Reaction (unfriendly NPC): 2d6 -1 for the risk of significant cost to their actions = 5 - 1 = 4 = Scorn.}
“I am a decorated soldier! How dare you imply high treason! I defend my honor!” He lurches forward as he draws a small dagger and combat begins.
{Veteran: [HD: 2. AC: 5. Hit: +2. Dmg: 1d8. Morale: 9. Skill: +1. Move: 20’] Rabbles (miner, A, and B): [HD: 1. AC: 9. Hit: +1. Dmg: 1d4. Morale: 8. Skill: +1. Move: 30’]}
{Nikova draws his warhammer and attacks the Veteran: 1d20 + Attribute Mod + Atk Bonus + Veteran AC = 15 + 2 + 1 + 5 = 23. Success! Dmg: 1d8 + 2 = 4 + 2 = 6. 2HD damage! Veteran incapacitated! Fray die: 1d8 = 6. 2 more HD damage! Nikova targets the other unknown Dwarves and handily knocks them out.}
The Veteran’s senses were definitely still keen, but his speed betrayed him. In a single movement my warhammer’s pommel struck him atop his crown. He lay unconscious. My attention turns to two other Dwarves approaching with no regard for the fate of their superior combatant. Being mindful to use non-lethal force, I quickly send them to greet the brothel’s floorboards. Fyodor straightens his stance as if to offer truce to the remaining club member: our miner.
{My scene prompt is to waylay a minion of the foe. I feel this means she’s going to attack for sure, but I won’t run that as combat. Nikova has quickly won this exchange so it’s safe to say between him and Fyodor, the miner is not going to change the outcome. I’ll assume Nikova is able to neutralize her attack and she is then taken into the guards’ custody.}
She declines to change her intent and thus charges. Spearstander maneuvers so I can catch the hand with which she is wielding her pickaxe. “No, miner. This is not your fate. I saw that ore and you know I did. You’re a precious gift to your Clan. Speak up and allow us to help you. Our Chieftain’s own son will ensure your protection.”
{NPC reaction (unfriendly NPC): 2d6 -1 for the risk of significant cost to their actions = 5 - 1 = 4 = Scorn.}
“I don’t need your fucking help,” she hissed through gritted teeth. “I’ve always been on me own and done just fine. Ye’ll get nothing from me, pig.” Recalcitrant words dart from her mouth like stingers; however, she drops the pickaxe all the same. She knows she is out of her element when it comes to combat, and must now answer for the treason of attacking Chieftain blood, along with her snoozing comrades. Fyodor whistles a signal that brings the guards inside to collect the perpetrators.
“We broke up a threat, for sure, but we didn’t learn anything either. I thank you for bringing this to my attention. I’ll try to find out what the purpose of this little group is once they awaken.”
“Spearstander. Honor.” I bow to Fyodor as he and his guards escort their prisoners. I have more questions than answers, yet I cannot shake the feeling that I am closer to the truth now than ever.
 
{Successful Conflict Scene! +1 Victory Point for Nikova and -1 Victory Point for the antagonist. These Dwarves attacked important figures to the clan and therefore no Heat was earned as a result of their beatdown. Antagonist VP roll (investigation + action scenes = 3 + 2 = 5): 1d10 = 2 = yes. +1 VP. Victory Point totals: Nikova 7, Antagonist 3. Heat: 1}
 
 
{I feel from here another investigation scene is prudent. Investigation Scene: 1d10 = 3 = Tail an Actor who might have a Clue. On a check failure, face a Fight. Roll Actor: 1d8 = 1, 1d10 = 3, 1d3 (1d6 cut in half, rounded up) = 1 = 1. Commoner. Beautiful young mistress. Actor Relationship: 1d100 = 22 = Crime Culprit. Wow! Here we go!}
I begin to search around the room for any information that may point me in the next direction. My search is shallow at first so I don’t disturb anything beyond that which resulted from the scuffle. I must have been quietly deep in thought because the creak of the door leading further into the brothel seemed as loud as a hawk’s screech! I turn my head in that direction to see only the face of a young female Dwarf. {Memorable Trait: 1d100 = 10 = Asthmatic} Clearly, she was not expecting anyone to be in the room because my presence startled her a great deal. She inhaled a loud gasp which immediately turned into a coughing fit.
Between coughs she choked out, “Ack!... I’m sorry, sir!... I saw guards… remove some patrons and since it was… quiet… I didn’t think anyone was in here! Please excuse me. Carry on.” She darts away before I can offer to fetch her a drink.
As I ponder this scenario a bit longer, the thought strikes me how little time had elapsed between the escort and her attempt to enter the room. I allow my instinct to carry along the suspicion and I leave the building.
For the time being, I wait around the corner of a nearby building with an eye on the brothel. Soon, a cloaked individual peeks out of the rear door. They establish the coast is clear and quickly walk out carrying something. My ears pick up a wheezing, and I surmise this figure must be the beautiful young Dwarf from moments ago. I do my best to give her some distance and then begin stealthy pursuit.
{This will be an opposed check because Nikova is trying to tail her and she’s trying to go unnoticed to her destination. Young Mistress: 2d8 + 1 for general skill bonus since she’s been able to avoid detection this far in the adventure = 5 + 1 = 6. Nice! Nikova: 2d8 + 1 for Dex + 1 for lightning reflexes = 12 + 1 + 1 = 14. Success!!}
She’s not running full tilt, and I have homed in on the sound of her labored breath. Even when she dips out of sight I’m able to track her down quickly. She turns to look back only a few times, and thanks to the hood of her cloak, I have just enough time to strafe behind cover before her field of vision can catch me. She eventually reaches a small home not too far from the Chieftain’s fortification. Something is not sitting well with this at all… Not to mention the fact that she is now carrying something that was not in her possession when I first saw her. I suspect she collected it from that room, and I’ll wager it has something to do with the ruffians removed from thence. I remain hidden from view of the home’s windows and consider my options.
{Investigation Scene Successful! +1 Victory Point! +1 Clue! Antagonist VP roll (investigation + action scenes = 4 + 2 = 6): 1d10 = 7 = no. Woohoo!! Victory Point Totals: Nikova 8, Antagonist 3. Heat: 1}
 
 
BRUH! These were fun! I’m digging not having to drag out these scenes! Pick the scene, generate something that makes sense, resolve the challenge, then set up for the next scene. Easy peasy! It also helps that my rolls have been fantastic lately, so fate has been quite kind. I feel like the easiest path from here would be to try maybe one more investigation scene and then a conflict scene. If I win them both, I’ll hit 10 Victory points by a margin of 7 and can move immediately to an action scene to try winning the campaign. If I fail the action scene, I still have a spare clue to get straight into a second attempt. Sort of like a failsafe.
Thanks to all of you following along! Take a rest and we will rendezvous again in haste!
Reign
submitted by Consummate_Reign to solorpgplay [link] [comments]

Wall Street Week Ahead for the trading week beginning March 9th, 2020

Good Saturday morning to all of you here on StockMarket. I hope everyone on this sub made out pretty nicely in the market this past week, and is ready for the new trading week and month ahead.
Here is everything you need to know to get you ready for the trading week beginning March 9th, 2020.

Wall Street braces for more market volatility as wild swings become the ‘new normal’ amid coronavirus - (Source)

The S&P 500 has never behaved like this, but Wall Street strategists say get used to it.
Investors just witnessed the equity benchmark swinging up or down 2% for four days straight in the face of the coronavirus panic.
In the index’s history dating back to 1927, this is the first time the S&P 500 had a week of alternating gains and losses of more than 2% from Monday through Thursday, according to Bespoke Investment Group. Daily swings like this over a two-week period were only seen at the peak of the financial crisis and in 2011 when U.S. sovereign debt got its first-ever downgrade, the firm said.
“The message to all investors is that they should expect this volatility to continue. This should be considered the new normal going forward,” said Mike Loewengart, managing director of investment strategy at E-Trade.
The Dow Jones Industrial Average jumped north of 1,000 points twice in the past week, only to erase the quadruple-digit gains in the subsequent sessions. The coronavirus outbreak kept investors on edge as global cases of the infections surpassed 100,000. It’s also spreading rapidly in the U.S. California has declared a state of emergency, while the number of cases in New York reached 33.
“Uncertainty breeds greater market volatility,” Keith Lerner, SunTrust’s chief market strategist, said in a note. “Much is still unknown about how severe and widespread the coronavirus will become. From a market perspective, what we are seeing is uncomfortable but somewhat typical after shock periods.”

More stimulus?

So far, the actions from global central banks and governments in response to the outbreak haven’t triggered a sustainable rebound.
The Federal Reserve’s first emergency rate cut since the financial crisis did little to calm investor anxiety. President Donald Trump on Friday signed a sweeping spending bill with an $8.3 billion package to aid prevention efforts and the production of a vaccine for the deadly disease, but stocks extended their heavy rout that day.
“The market is recognizing the global authorities are responding to this,” said Tom Essaye, founder of the Sevens Report. “If the market begins to worry they are not doing that sufficiently, then I think we are going to go down ugly. It is helping stocks hold up.”
Essaye said any further stimulus from China and a decent-sized fiscal package from Germany would be positive to the market, but he doesn’t expect the moves to create a huge rebound.
The fed funds future market is now pricing in the possibility of the U.S. central bank cutting by 75 basis points at its March 17-18 meeting.

Where is the bottom?

Many on Wall Street expect the market to fall further before recovering as the health crisis unfolds.
Binky Chadha, Deutsche Bank’s chief equity strategist, sees a bottom for the S&P 500 in the second quarter after stocks fall as much as 20% from their recent peak.
“The magnitude of the selloff in the S&P 500 so far has further to go; and in terms of duration, just two weeks in, it is much too early to declare this episode as being done,” Chadha said in a note. “We do view the impacts on macro and earnings growth as being relatively short-lived and the market eventually looking through them.”
Deutsche Bank maintained its year-end target of 3,250 for the S&P 500, which would represent a 10% gain from here and a flat return for 2020.
Strategists are also urging patience during this heightened volatility, cautioning against panic selling.
“It is during times like these that investors need to maintain a longer-term perspective and stick to their investment process rather than making knee-jerk, binary decisions,” Brian Belski, chief investment strategist at BMO Capital Markets, said in a note.

This past week saw the following moves in the S&P:

(CLICK HERE FOR THE FULL S&P TREE MAP FOR THE PAST WEEK!)

Major Indices for this past week:

(CLICK HERE FOR THE MAJOR INDICES FOR THE PAST WEEK!)

Major Futures Markets as of Friday's close:

(CLICK HERE FOR THE MAJOR FUTURES INDICES AS OF FRIDAY!)

Economic Calendar for the Week Ahead:

(CLICK HERE FOR THE FULL ECONOMIC CALENDAR FOR THE WEEK AHEAD!)

Sector Performance WTD, MTD, YTD:

(CLICK HERE FOR FRIDAY'S PERFORMANCE!)
(CLICK HERE FOR THE WEEK-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE MONTH-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE 3-MONTH PERFORMANCE!)
(CLICK HERE FOR THE YEAR-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE 52-WEEK PERFORMANCE!)

Percentage Changes for the Major Indices, WTD, MTD, QTD, YTD as of Friday's close:

(CLICK HERE FOR THE CHART!)

S&P Sectors for the Past Week:

(CLICK HERE FOR THE CHART!)

Major Indices Pullback/Correction Levels as of Friday's close:

(CLICK HERE FOR THE CHART!)

Major Indices Rally Levels as of Friday's close:

(CLICK HERE FOR THE CHART!)

Most Anticipated Earnings Releases for this week:

(CLICK HERE FOR THE CHART!)

Here are the upcoming IPO's for this week:

(CLICK HERE FOR THE CHART!)

Friday's Stock Analyst Upgrades & Downgrades:

(CLICK HERE FOR THE CHART LINK #1!)
(CLICK HERE FOR THE CHART LINK #2!)
(CLICK HERE FOR THE CHART LINK #3!)

A "Run of the Mill" Drawdown

If you're like us, you've heard a lot of people reference the recent equity declines as a sign that the market is pricing in some sort of Armageddon in the US economy. While comments like that make for great soundbites, a little perspective is in order. Since the S&P 500's high on February 19th, the S&P 500 is down 12.8%. In the chart below, we show the S&P 500's annual maximum drawdown by year going back to 1928. In the entire history of the index, the median maximum drawdown from a YTD high is 13.05%. In other words, this year's decline is actually smaller than normal. Perhaps the fact that we have seen only one larger-than-average drawdown in the last eight years is why this one feels so bad.
The fact that the current decline has only been inline with the historical norm raises a number of questions. For example, if the market has already priced in the worst-case scenario, going out and adding some equity exposure would be a no brainer. However, if we're only in the midst of a 'normal' drawdown in the equity market as the coronavirus outbreak threatens to put the economy into a recession, one could argue that things for the stock market could get worse before they get better, especially when we know that the market can be prone to over-reaction in both directions. The fact is that nobody knows right now how this entire outbreak will play out. If it really is a black swan, the market definitely has further to fall and now would present a great opportunity to sell more equities. However, if it proves to be temporary and after a quarter or two resolves itself and the economy gets back on the path it was on at the start of the year, then the magnitude of the current decline is probably appropriate. As they say, that's what makes a market!
(CLICK HERE FOR THE CHART!)
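For anyone who wants the metric pinned down, maximum drawdown from the running high can be computed in one pass over daily closes; here is a quick sketch (my own illustration, with a made-up function name and sample prices):

;; Largest percentage decline from the running peak of a price series.
(defun max-drawdown (prices)
  (let ((peak (first prices))
        (worst 0.0))
    (dolist (p (rest prices) worst)
      (setf peak (max peak p))
      (setf worst (max worst (/ (- peak p) peak))))))

;; (max-drawdown '(100.0 110.0 95.0 105.0 90.0)) => 0.18181819
;; i.e., an 18.2% drop from the 110.0 peak to the 90.0 trough.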

Long-Term Treasuries Go Haywire

Take a good look at today's moves in long-term US Treasury yields, because chances are you won't see moves of this magnitude again soon. Let's start with the yield on the 30-year US Treasury. Today's decline of 29 basis points in the yield will go down as the largest one-day decline in the yield on the 30-year since 2009. For some perspective, there have only been 25 other days since 1977 when the yield saw a larger one-day decline.
(CLICK HERE FOR THE CHART!)
That doesn't even tell the whole story, though. As shown in the chart below, every other time the yield saw a sharper one-day decline, the actual yield of the 30-year was much higher, and in most other cases it was much, much higher.
(CLICK HERE FOR THE CHART!)
To show this another way, the percentage change in the yield on the 30-year has never been seen before, and it's not even close. Now, before the chart crime police come calling, we realize showing a percentage change of a percentage is not the most accurate representation, but we wanted to show this for illustrative purposes only.
(CLICK HERE FOR THE CHART!)
Finally, with long-term interest rates plummeting, we wanted to provide an update on the performance of the Austrian 100-year bond. It's now back at record highs, raising the question: why is the US not flooding the market with long-term debt?
(CLICK HERE FOR THE CHART!)

It Doesn't Get Much Worse Than This For Crude Oil

Crude oil prices are down close to 10% today in what is shaping up to be the worst day for crude oil since late 2014. That's more than five years.
(CLICK HERE FOR THE CHART!)
Today's decline is pretty much a continuation of what has been a one-way trade for the commodity ever since the US drone strike on Iranian general Soleimani. The last time prices were this low was around Christmas 2018.
(CLICK HERE FOR THE CHART!)
With today's decline, crude oil is now off to its worst start to a year in a generation, falling 32%. Since 1984, the only other year that was worse was 1986, when the year started out with a decline of 50% through March 6th. If you're looking for a bright spot, in 1986 prices rose 36% over the remainder of the year. The only other year where crude oil kicked off the year with a 30% decline was 1991, after the first Iraq war. Over the remainder of that year, prices rose a more modest 5%.
(CLICK HERE FOR THE CHART!)

10-Year Treasury Yield Breaks Below 1%

Despite strong market gains on Wednesday, March 4, 2020, the on-the-run 10-year Treasury yield ended the day below 1% for the first time ever and has posted additional declines in real time, sitting at 0.92% intraday as this blog is being written. “The decline in yields has been remarkable,” said LPL Research Senior Market Strategist Ryan Detrick. “The 10-year Treasury yield has dipped below 1%, and today’s declines are likely to make the recent run lower the largest decline of the cycle.”
As shown in LPL Research’s chart of the day, the current decline in the 10-year Treasury yield without a meaningful reversal (defined as at least 0.75%) is approaching the decline seen in 2011 and 2012 and would need about another two months to be the longest decline in length of time. At the same time, no prior decline has lasted forever and a pattern of declines and increases has been normal.
(CLICK HERE FOR THE CHART!)
What are some things that can push the 10-year Treasury yield lower?
  • A shrinking but still sizable yield advantage over other developed market sovereign debt
  • Added stock volatility if downside risks to economic growth from the coronavirus increase
  • A larger potential premium over shorter-term yields if the Federal Reserve aggressively cuts interest rates
What are some things that can push the 10-year Treasury yield higher?
  • A second half economic rebound acting as a catalyst for a Treasury sell-off
  • As yields move lower, investors may increasingly seek more attractive sources of income
  • Any dollar weakness could lead to some selling by international investors
  • Longer maturity Treasuries are looking like an increasingly crowded trade, potentially adding energy to any sell-off
On balance, our view remains that the prospect of an economic rebound over the second half points to the potential for interest rates moving higher. At the same time, we still see some advantage in the potential diversification benefits of intermediate maturity high-quality bonds, especially during periods of market stress. We continue to recommend that suitable investors consider keeping a bond portfolio’s sensitivity to changes in interest rates below that of the benchmark Bloomberg Barclays U.S. Aggregate Bond Index by emphasizing short to intermediate maturity bonds, but do not believe it’s time to pile into very short maturities despite the 10-year Treasury yield sitting at historically low levels.

U.S. Jobs Growth Marches On

While stock markets continue to be extremely volatile as they come to terms with how the coronavirus may affect global growth, the U.S. job market has remained remarkably robust. Continued U.S. jobs data resilience in the face of headwinds from the coronavirus outbreak may be a key factor in prolonging the expansion, given how important the strength of the U.S. consumer has been late into this expansion.
The U.S. Department of Labor today reported that U.S. nonfarm payroll data had a strong showing of 273,000 jobs added in February, topping the expectation of every Bloomberg-surveyed economist, with an upward revision of an additional 85,000 jobs for December 2019 and January 2020. This has brought the current unemployment rate back to its 50-year low of 3.5%. So far, it appears it’s too soon for any effects of the coronavirus to have been felt in the jobs numbers. (Note: The survey takes place in the middle of each month.)
On Wednesday, ADP released its private payroll data (excluding government jobs), which increased by 183,000 in February, also handily beating market expectations. Most of these jobs were added in the service sector, with 44,000 added in the leisure and hospitality sector, and another 31,000 in trade/transportation/utilities. Both of these areas could be at risk of potential cutbacks if consumers start to avoid eating out or other leisure pursuits due to coronavirus fears.
As shown in the LPL Chart of the Day, payrolls remain strong, and any effects of the virus outbreaks most likely would be felt in coming months.
(CLICK HERE FOR THE CHART!)
“February’s jobs report shows the 113th straight month that the U.S. jobs market has grown,” said LPL Financial Senior Market Strategist Ryan Detrick. “That’s an incredible run and highlights how the U.S. consumer has become key to extending the expansion, especially given setbacks to global growth from the coronavirus outbreak.”
While there is bound to be some drag on future jobs data from the coronavirus-related slowdown, we would anticipate that the effects may be transitory. We believe economic fundamentals continue to suggest the possibility of a second-half-of-the-year economic rebound.

Down January & Down February: S&P 500 Posts Full-Year Gain Just 43.75% of Time

The combination of a down January and a down February has come about 17 times, including this year, going back to 1950. Rest of the year and full-year performance has taken a rather sizable hit following the previous 16 occurrences. March through December S&P 500 average performance drops to 2.32% compared to 7.69% in all years. Full-year performance is even worse with S&P 500 average turning to a loss of 4.91% compared to an average gain of 9.14% in all years. All hope for 2020 is not lost as seven of the 16 past down January and down February years did go on to log gains over the last 10 months and full year while six enjoyed double-digit gains from March to December.
(CLICK HERE FOR THE CHART!)

Take Caution After Emergency Rate Cut

Today’s big rally was an encouraging sign that the markets are becoming more comfortable with the public health, monetary, and political handling of the situation. But the history of these “emergency” or “surprise” rate cuts by the Fed between meetings suggests some caution remains in order.
The table here shows that these surprise cuts between meetings have really only “worked” once in the past 20+ years: in 1998, when the Fed and the plunge protection team acted swiftly and in a coordinated manner to stave off the fallout from the financial crisis caused by the collapse of the Russian ruble and the highly leveraged Long-Term Capital Management hedge fund, markets responded well. This was not the case during the extended bear markets of 2001-2002 and 2007-2009.
Bottom line: if this is a short-term impact like the 1998 financial crisis, the market should recover sooner rather than later. But if the economic impact of the coronavirus is prolonged, the market is more likely to languish.
(CLICK HERE FOR THE CHART!)

STOCK MARKET VIDEO: Stock Market Analysis Video for Week Ending March 6th, 2020

(CLICK HERE FOR THE YOUTUBE VIDEO!)

STOCK MARKET VIDEO: ShadowTrader Video Weekly 3.8.20

(CLICK HERE FOR THE YOUTUBE VIDEO!)
Here are the most notable companies (tickers) reporting earnings in this upcoming trading week ahead:
  • $ADBE
  • $DKS
  • $AVGO
  • $THO
  • $ULTA
  • $WORK
  • $DG
  • $SFIX
  • $SOGO
  • $DOCU
  • $INO
  • $CLDR
  • $INSG
  • $SOHU
  • $BTAI
  • $ORCL
  • $HEAR
  • $NVAX
  • $ADDYY
  • $GPS
  • $AKBA
  • $PDD
  • $CYOU
  • $FNV
  • $MTNB
  • $NERV
  • $MTN
  • $BEST
  • $PRTY
  • $NINE
  • $AZUL
  • $UNFI
  • $PRPL
  • $VSLR
  • $KLZE
  • $ZUO
  • $DVAX
  • $EXPR
  • $VRA
  • $AXSM
  • $CDMO
  • $CASY
(CLICK HERE FOR NEXT WEEK'S MOST NOTABLE EARNINGS RELEASES!)
(CLICK HERE FOR NEXT WEEK'S HIGHEST VOLATILITY EARNINGS RELEASES!)
Below are some of the notable companies coming out with earnings releases this upcoming trading week ahead which includes the date/time of release & consensus estimates courtesy of Earnings Whispers:

Monday 3.9.20 Before Market Open:

(CLICK HERE FOR MONDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Monday 3.9.20 After Market Close:

(CLICK HERE FOR MONDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Tuesday 3.10.20 Before Market Open:

(CLICK HERE FOR TUESDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Tuesday 3.10.20 After Market Close:

(CLICK HERE FOR TUESDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Wednesday 3.11.20 Before Market Open:

(CLICK HERE FOR WEDNESDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Wednesday 3.11.20 After Market Close:

(CLICK HERE FOR WEDNESDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Thursday 3.12.20 Before Market Open:

(CLICK HERE FOR THURSDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Thursday 3.12.20 After Market Close:

(CLICK HERE FOR THURSDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Friday 3.13.20 Before Market Open:

(CLICK HERE FOR FRIDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Friday 3.13.20 After Market Close:

NONE.

Adobe Inc. $336.77

Adobe Inc. (ADBE) is confirmed to report earnings at approximately 4:05 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $2.23 per share on revenue of $3.04 billion and the Earnings Whisper ® number is $2.29 per share. Investor sentiment going into the company's earnings release has 81% expecting an earnings beat. The company's guidance was for earnings of approximately $2.23 per share. Consensus estimates are for year-over-year earnings growth of 29.65% with revenue increasing by 16.88%. Short interest has decreased by 38.4% since the company's last earnings release while the stock has drifted higher by 7.2% from its open following the earnings release to be 10.9% above its 200 day moving average of $303.70. Overall earnings estimates have been revised higher since the company's last earnings release. On Monday, February 24, 2020 there was some notable buying of 1,109 contracts of the $400.00 call expiring on Friday, March 20, 2020. Option traders are pricing in a 9.3% move on earnings and the stock has averaged a 4.1% move in recent quarters.

(CLICK HERE FOR THE CHART!)

DICK'S Sporting Goods, Inc. $34.98

DICK'S Sporting Goods, Inc. (DKS) is confirmed to report earnings at approximately 7:30 AM ET on Tuesday, March 10, 2020. The consensus earnings estimate is $1.23 per share on revenue of $2.56 billion and the Earnings Whisper ® number is $1.28 per share. Investor sentiment going into the company's earnings release has 57% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 14.95% with revenue increasing by 2.73%. Short interest has decreased by 29.1% since the company's last earnings release while the stock has drifted lower by 20.3% from its open following the earnings release to be 12.0% below its 200 day moving average of $39.75. Overall earnings estimates have been revised higher since the company's last earnings release. On Wednesday, February 26, 2020 there was some notable buying of 848 contracts of the $39.00 put expiring on Friday, March 20, 2020. Option traders are pricing in a 14.4% move on earnings and the stock has averaged a 7.3% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Broadcom Limited $269.45

Broadcom Limited (AVGO) is confirmed to report earnings at approximately 4:15 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $5.34 per share on revenue of $5.93 billion and the Earnings Whisper ® number is $5.45 per share. Investor sentiment going into the company's earnings release has 83% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 5.65% with revenue increasing by 2.44%. Short interest has decreased by 15.6% since the company's last earnings release while the stock has drifted lower by 15.3% from its open following the earnings release to be 7.7% below its 200 day moving average of $291.95. Overall earnings estimates have been revised lower since the company's last earnings release. On Tuesday, February 25, 2020 there was some notable buying of 1,197 contracts of the $260.00 put expiring on Friday, April 17, 2020. Option traders are pricing in an 11.1% move on earnings and the stock has averaged a 4.9% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Thor Industries, Inc. $70.04

Thor Industries, Inc. (THO) is confirmed to report earnings at approximately 6:45 AM ET on Monday, March 9, 2020. The consensus earnings estimate is $0.76 per share on revenue of $1.79 billion and the Earnings Whisper ® number is $0.84 per share. Investor sentiment going into the company's earnings release has 62% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 16.92% with revenue increasing by 38.70%. Short interest has decreased by 12.9% since the company's last earnings release while the stock has drifted higher by 5.4% from its open following the earnings release to be 12.0% above its 200 day moving average of $62.53. Overall earnings estimates have been revised lower since the company's last earnings release. Option traders are pricing in a 6.3% move on earnings and the stock has averaged a 8.1% move in recent quarters.

(CLICK HERE FOR THE CHART!)

ULTA Beauty $256.58

ULTA Beauty (ULTA) is confirmed to report earnings at approximately 4:00 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $3.71 per share on revenue of $2.29 billion and the Earnings Whisper ® number is $3.75 per share. Investor sentiment going into the company's earnings release has 73% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 2.77% with revenue increasing by 7.78%. Short interest has increased by 8.7% since the company's last earnings release while the stock has drifted lower by 0.1% from its open following the earnings release to be 9.5% below its 200 day moving average of $283.43. Overall earnings estimates have been revised lower since the company's last earnings release. Option traders are pricing in a 15.3% move on earnings and the stock has averaged a 11.7% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Slack Technologies, Inc. $26.42

Slack Technologies, Inc. (WORK) is confirmed to report earnings at approximately 4:15 PM ET on Thursday, March 12, 2020. The consensus estimate is for a loss of $0.06 per share on revenue of $173.06 million and the Earnings Whisper ® number is ($0.04) per share. Investor sentiment going into the company's earnings release has 67% expecting an earnings beat. The company's guidance was for a loss of $0.07 to $0.06 per share on revenue of $172.00 million to $174.00 million. Short interest has increased by 1.2% since the company's last earnings release while the stock has drifted higher by 19.0% from its open following the earnings release. Overall earnings estimates have been revised higher since the company's last earnings release. The stock has averaged a 4.3% move on earnings in recent quarters.

(CLICK HERE FOR THE CHART!)

Dollar General Corporation $158.38

Dollar General Corporation (DG) is confirmed to report earnings at approximately 6:55 AM ET on Thursday, March 12, 2020. The consensus earnings estimate is $2.02 per share on revenue of $7.15 billion and the Earnings Whisper ® number is $2.05 per share. Investor sentiment going into the company's earnings release has 76% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 9.78% with revenue increasing by 7.52%. Short interest has increased by 16.2% since the company's last earnings release while the stock has drifted higher by 1.8% from its open following the earnings release to be 5.7% above its 200 day moving average of $149.88. Overall earnings estimates have been revised higher since the company's last earnings release. On Friday, February 28, 2020 there was some notable buying of 1,013 contracts of the $182.50 call expiring on Friday, March 20, 2020. Option traders are pricing in a 9.2% move on earnings and the stock has averaged a 5.7% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Stitch Fix, Inc. $22.78

Stitch Fix, Inc. (SFIX) is confirmed to report earnings at approximately 4:05 PM ET on Monday, March 9, 2020. The consensus earnings estimate is $0.06 per share on revenue of $452.96 million and the Earnings Whisper ® number is $0.09 per share. Investor sentiment going into the company's earnings release has 83% expecting an earnings beat. The company's guidance was for revenue of $447.00 million to $455.00 million. Consensus estimates are for earnings to decline year-over-year by 50.00% with revenue increasing by 22.33%. Short interest has decreased by 4.6% since the company's last earnings release while the stock has drifted lower by 16.1% from its open following the earnings release to be 5.1% below its 200 day moving average of $24.01. Overall earnings estimates have been revised higher since the company's last earnings release. On Wednesday, February 19, 2020 there was some notable buying of 4,026 contracts of the $35.00 call expiring on Friday, June 19, 2020. Option traders are pricing in a 28.0% move on earnings and the stock has averaged a 15.2% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Sogou Inc. $3.85

Sogou Inc. (SOGO) is confirmed to report earnings at approximately 4:00 AM ET on Monday, March 9, 2020. The consensus earnings estimate is $0.09 per share on revenue of $303.08 million and the Earnings Whisper ® number is $0.10 per share. Investor sentiment going into the company's earnings release has 58% expecting an earnings beat. The company's guidance was for revenue of $290.00 million to $310.00 million. Consensus estimates are for year-over-year earnings growth of 28.57% with revenue increasing by 1.78%. Short interest has increased by 6.6% since the company's last earnings release while the stock has drifted lower by 27.8% from its open following the earnings release to be 15.7% below its 200 day moving average of $4.57. Overall earnings estimates have been revised lower since the company's last earnings release. The stock has averaged a 3.8% move on earnings in recent quarters.

(CLICK HERE FOR THE CHART!)

DocuSign $84.02

DocuSign (DOCU) is confirmed to report earnings at approximately 4:05 PM ET on Thursday, March 12, 2020. The consensus earnings estimate is $0.05 per share on revenue of $267.44 million and the Earnings Whisper ® number is $0.08 per share. Investor sentiment going into the company's earnings release has 81% expecting an earnings beat. The company's guidance was for revenue of $263.00 million to $267.00 million. Consensus estimates are for year-over-year earnings growth of 600.00% with revenue increasing by 33.90%. Short interest has decreased by 37.7% since the company's last earnings release while the stock has drifted higher by 12.1% from its open following the earnings release to be 31.9% above its 200 day moving average of $63.71. Overall earnings estimates have been revised higher since the company's last earnings release. On Wednesday, March 4, 2020 there was some notable buying of 1,698 contracts of the $87.50 call expiring on Friday, March 20, 2020. Option traders are pricing in an 8.5% move on earnings and the stock has averaged a 10.0% move in recent quarters.

(CLICK HERE FOR THE CHART!)

DISCUSS!

What are you all watching for in this upcoming trading week?
I hope you all have a wonderful weekend and a great trading week ahead, StockMarket.
submitted by bigbear0083 to StockMarket [link] [comments]
