
New Developments and Interventions in TB Diagnosis & Treatment

March 24, 2023 by health.economictimes.indiatimes.com

Prof. Ashok Rattan

Tuberculosis (TB) is a disease of great antiquity. Evidence of human infection has been found in mummies dating back 9,000 years. Initially it was thought to be hereditary, but its microbiological origin was first described on 24 March 1882 in Germany by Robert Koch, a general physician. With the discovery of Streptomycin, it was hoped that TB would be wiped out. Instead, TB has become a tremendous public health problem, especially in developing countries. More people have died from TB (1 billion) than from any other disease, making it a public health emergency.

In 2015, the World Health Organization got all heads of state to agree that the time had come to shift the focus from controlling TB to eradicating it. Several advances have followed the new emphasis on eliminating TB as a public health problem by 2035, backed by political will and matching funds. WHO has released a new set of consolidated guidelines. Covid-19 caused a setback, but eradication efforts are back on track.

Though TB can affect any part of the body, pulmonary TB is the most typical manifestation of the disease, since it is transmitted through the respiratory route. The gastrointestinal route from infected cows was blocked by our ancestors' habit of boiling milk before drinking it, and later by pasteurization in the West. Cough, low-grade fever, night sweats, and weight loss are the four cardinal symptoms of TB; anyone experiencing these for more than two weeks should seek medical attention. Recently, artificial intelligence software has been developed to read digital chest X-rays and detect hot spots in the lungs, making it suitable for mass active case finding of TB in the community.

People who are either symptomatic or have hot spots on AI-read digital chest X-rays should be confirmed as suffering from TB by examining their sputum. A spot sample should always be collected. An early morning sputum sample, representing six to eight hours of secretion collection in the bronchial tree, might also be tested in the early stages. WHO recommends that the initial method for confirmation of TB should be molecular (which can detect between 16 and 131 bacteria in the sample) rather than AFB (Acid-Fast Bacilli) smear examination, which needs at least 10,000 bacteria to be present in the sample before they can be detected by smear microscopy. The consequence of using smear microscopy, which is easy to perform and cheap, is that it can take at least six months before the bacterial load crosses the detection level. Since as few as three bacteria can transmit infection, the person would infect 5 to 25 close contacts before being put on treatment.

The government has made the molecular methods, the Cartridge-Based Nucleic Acid Amplification Test (CBNAAT) and Truenat, free of cost in every part of the country. The Clinton Foundation, through IPAC, has made these tests available in private laboratories at fixed prices. CBNAAT, also called GeneXpert, detects the Mycobacterium tuberculosis complex (MTBC) and reports on susceptibility to Rifampicin, which has been used as a surrogate marker of Multidrug-Resistant (MDR) TB, indicating resistance to both INH and Rifampicin. Results take only two hours of machine time, so reports should be available for decision-making by the next day. Drug-Susceptible TB (DS TB) is treated with four drugs for two months, followed by two drugs for four months. All drugs are made available free of cost to the patient.

However, the long duration of treatment has led many patients to discontinue it midway, which allowed the damaged bacteria not only to recover and continue the infection but also to become drug resistant. In the past, clinicians used a number of regimens for varying lengths of time, leading to MTBC becoming resistant to many anti-TB drugs. If the MTBC resists INH and Rifampicin, it is called MDR TB; if, in addition, it is resistant to fluoroquinolones and aminoglycosides, it is called XDR TB (extensively drug-resistant). It therefore came as a new ray of hope when reports from South Africa emerged that, using three all-oral drugs (Bedaquiline, Linezolid, and Pretomanid), researchers could cure all XDR TB cases after six months of treatment. Subsequent analysis indicated that injectable aminoglycosides were not contributing much to recovery, and that at least 18 drugs have activity against resistant MTBC.

Many attempts have been made to shorten the six months of treatment that is standard for all DS TB cases. A study indicated that if Rifampicin was replaced with Rifapentine and Ethambutol with Moxifloxacin, a four-month treatment was non-inferior to the standard six-month treatment. Recently, it has been reported that eight weeks of five drugs (Bedaquiline, Linezolid, INH, Pyrazinamide and Ethambutol) was as effective as the standard 24-week treatment, leading to better compliance.

It has also emerged that treating Rifampicin as a surrogate marker for MDR TB was not based on facts: INH monoresistance (IMR) also occurs. If GeneXpert indicates Rifampicin susceptibility but there is IMR, then for the four months of the continuation phase these patients would effectively receive Rifampicin monotherapy, which would encourage the development of MDR TB. WHO recommends adding a fluoroquinolone in IMR cases, but the two-test strategy of GeneXpert followed by a Line Probe Assay usually leads to unacceptable delay in initiating the correct evidence-based regimen. With the availability of next-generation molecular tests like BD MAX, which detect MTBC and also report on both INH and Rifampicin susceptibility, rapid evidence-based management of DS TB should now be possible without any loss of time.

With as many as 18 drugs to choose from for the treatment of MDR TB cases, WHO has classified them into three classes. Class A consists of the most active drugs and includes Moxifloxacin and Levofloxacin besides Bedaquiline and Linezolid. Class B includes Cycloserine and Clofazimine, while all others fall into Class C. Ideally, an MDR TB case should be treated with at least four drugs to which the MTBC is sensitive.

As resistance in MTBC is caused by mutations in target genes, and four of the 18 available drugs need to be selected based on susceptibility, performing phenotypic MIC testing is both difficult and costly. The development and availability of targeted Next Generation Sequencing has made it possible to simultaneously investigate all 32 genes that are targets of these 18 drugs and to report on all of them, so that individualized treatment for each patient can be based on evidence, leading to better compliance and response.

Besides treatment of active cases, persons with TB infection (TBI), in which a person harbours latent bacteria with no obvious symptoms and no transmission of illness, also need prophylactic treatment to prevent progression to active disease. Risk groups who would benefit most from this TB Preventive Treatment (TPT) have been identified. The Mantoux skin test, which gave many false positive results, has now been replaced by a new skin test called Cy-Tb, which consists of two MTBC-specific recombinant antigens. TBI can be detected either by a positive reaction to this new skin test antigen, or by taking blood and investigating it in the laboratory with an interferon-gamma release assay. Treatment must be taken for three to nine months, and various regimens have been proposed; the best appears to be one tablet of INH and Rifapentine every week for 12 weeks.

The government of India has set up a portal called Nikshay, into which anyone diagnosing or treating TB cases must enter all information. Once the information is uploaded, the patient receives nutritional support funds for the duration of the treatment, and the reporting doctor receives an incentive both for the initial notification of the case and again after treatment.

It is clear that if we hope to eradicate this ancient disease, many people will have to pursue a number of recommended activities actively. Together it is possible to prevent TB and to end TB.

Prof. Ashok Rattan, Chairman, Medical Committee & Quality, Redcliffe Labs. Former Laboratory Director, CAREC, PAHO/WHO.

(DISCLAIMER: The views expressed are solely of the author and ETHealthworld does not necessarily subscribe to it. ETHealthworld.com shall not be responsible for any damage caused to any person / organisation directly or indirectly.)


Surge in eating disorders spurs state legislative action

March 23, 2023 by www.chron.com


DENVER (AP) — Stranded at home amid pandemic lockdowns in spring 2020, Emma Warford stumbled down a social media rabbit hole in her quest to get in shape. Viral 28-day fitness challenges. YouTubers promising “hourglass abs.” Diet videos where slim-stomached influencers peddled calorie-tracking apps.

Warford, then a 15-year-old starting volleyball player, bought a food scale and began replacing meals with energy drinks hawked by social media stars.

Soon, her calorie cutting became a compulsion. The thought of eating cake for her 16th birthday induced severe anxiety. By season’s end, she began volleyball games benched, too feeble to start. A year into the pandemic, her heart rate slowed and she was rushed to the hospital.

Stories like Warford’s are why lawmakers in Colorado, California, Texas, New York and elsewhere are taking big legislative swings at the eating disorder crisis. On Thursday, Colorado lawmakers advanced a bill that would create a state Office of Disordered Eating Prevention, intended in part to patch holes in care, to fund research and to raise awareness.

The bill passed committee by a 6-3 vote with Republicans demurring, partly concerned with the creation of a new government office and skeptical of its efficacy.

Warford, who’s now in recovery after two years of treatment, is among nearly 30 million Americans — about the population of Texas — who will struggle with an eating disorder in their lifetime. Every year over 10,000 die from an eating disorder, according to data cited by the National Association of Anorexia Nervosa and Associated Disorders.

Proposals across the U.S. include restricting social media algorithms from promoting potentially harmful content; prohibiting the sale of weight loss pills to minors; and adding eating disorder prevention to middle and high school curriculums.

The slew of legislation follows a spike in eating disorder cases as pandemic lockdowns pushed youth into long bouts of isolation. Hospital beds filled and waiting lists swelled as many struggled to find treatment for an illness that already had few care options. In Colorado, only one hospital was equipped to offer inpatient care for Warford, who was diagnosed with anorexia.

Anorexia typically involves restrictive eating habits and can cause abnormally low blood pressure and organ damage. Binge eating disorder is a compulsion in the other direction. Still, having an eating disorder does not invariably mean someone is overweight or underweight — and that’s left many who suffer with the mental illness to go undiagnosed, experts say.

Colorado’s bill creates a state office that is broadly charged with, in part, closing gaps in treatments, offering research grants, and working to educate students, teachers and parents. Bills in New York and Texas similarly seek to educate students on mental illnesses including eating disorders.

Katrina Velasquez, chief policy officer of the national Eating Disorder Coalition, said these policies will give students the tools to catch signs of disordered eating habits in themselves or their peers early — potentially giving them a critical head start in treatment.

Colorado is also taking a swing at axing the use of body mass index, or BMI, even though it remains the industry standard. The measurement is often used to determine the level of care required for those with eating disorders, but mental illness is not invariably linked to body weight or BMI, said Claire Engels, program coordinator for the Eating Disorder Foundation. That means that those who fall outside of the BMI prescription are often denied care, or kicked out of treatment prematurely.

“Eating disorders are not necessarily about food. It’s about mental illness, anxiety, depression, trauma” and control, Engels said.

When Riley Judd was around 12, she saw a photo of herself on vacation in a bathing suit. Turning to her mom she said, “I look like a whale.” It was the first time she remembered a voice in her head ruthlessly comparing her to the beaming, thin celebrities on the cover of Seventeen Magazine and Girls’ Life. “If I lose all this weight, people will like me,” the voice muttered to her. She attempted suicide at age 13.

“It was an all-consuming voice,” said Judd, now a legislative intern and student at the University of Denver.

California lawmakers are targeting social media with a bill prohibiting social media platforms from having algorithms or features that expose children to diet products or lead them to develop an eating disorder. Platforms that violate the legislation could be fined $250,000.

Another California bill would expand the list of approved facilities that can provide inpatient treatment to people with eating disorders — similar to a Texas proposal that would expand Medicaid coverage for mental health services, including eating disorders.

Texas state Rep. Shelby Slawson, a Republican, also introduced a bill to protect minors who use digital platforms.

Cathy Johnson, a school counselor of 24 years who testified on the Texas proposal, said “one of the biggest issues” she has seen from social media is an increase in eating disorders.

“We have kids having panic attacks in school because their anxiety is so high, they are comparing themselves, they think they are going to be like one of the influencers on TikTok,” Johnson said.

___

Associated Press reporters Sophie Austin contributed from California, Acacia Coronado contributed from Texas, and Michael Hill contributed from New York. Jesse Bedayn is a corps member for the Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.


Interest rates expected to rise for 11th time in less than two years

March 23, 2023 by metro.co.uk

The Bank of England is expected to raise the interest rate for the 11th time in less than 18 months after Wednesday’s surprise jump in inflation.

It is due to announce its latest decision at noon as it tries to reconcile the UK’s weak economic outlook and an international banking crisis.

The BoE faces a difficult balancing act, weighing the need to rein in inflation against worries over banking woes and the possibility that banks may start to clamp down on lending.

Last month’s rise in interest rates took them to their highest level in 14 years.

Yesterday’s data – showing inflation rising to 10.4% in February rather than continuing its descent – immediately turned today’s announcement into an almost one-way bet on a quarter-percentage-point increase in Bank Rate.

Financial markets widely expect a 0.25 percentage point hike to 4.25%.

It comes after inflation hit a 41-year high at 11.1% in October last year.

As recently as Tuesday, investors were split almost 50-50 on whether the BoE would leave Bank Rate unchanged for the first time since November 2021.

Bets earlier this week on the BoE halting its run of rate hikes were further bolstered by the rescue of Credit Suisse and the collapse of Silicon Valley Bank, which showed how some global banks were struggling to adjust to higher borrowing costs.

Any increase in Bank Rate would be part of a series of measures to reduce inflation, but would inevitably increase pressure on many people, particularly mortgage holders, already being squeezed by the cost-of-living crisis.

The European Central Bank also raised its three main interest rates by 50 basis points last week despite financial market turmoil engulfing Credit Suisse and the collapse of Silicon Valley Bank.

While some of the inflation rise can be blamed on one-off factors such as vegetable shortages, the underlying measures that the Bank of England tracks have also increased.

It was the first major central bank to start raising interest rates in December 2021.

ING economist James Smith said he expected any rate hike was likely to prove the last in the Bank of England’s run.


He said: ‘Assuming the broader inflation data continues to point to an easing in pipeline pressures, then we suspect the committee will be comfortable with pausing by the time of the next meeting in May.’

Shares on Wall Street tumbled sharply overnight on the Fed’s decision and its comments suggesting it does not expect to cut rates anytime soon, highlighting the fragility of stock market confidence.

After three days of bounceback gains, the S&P 500 fell 1.7%, while the Dow Jones Industrial Average lost 1.6% and the Nasdaq composite dropped 1.6%.

Some of the sharpest drops again came from the banking industry, where investors are worried about the possibility of more banks failing if customers pull out their money all at once.

Craig Erlam, a senior market analyst for Oanda, said: ‘Whatever flexibility the Bank of England may have thought it would have on Thursday was wiped out by Wednesday morning’s inflation data.’

He added there is ‘nothing that would justify a pause’ in raising interest rates, ‘even against the backdrop of financial stability concerns and the knock-on effects of aggressive rate hikes’.

The banking turmoil also threw into question whether higher interest rates are putting too much pressure on smaller lenders, which could be buckling under the weight of losses on their investments.

ING Economics suggests the Bank of England will want to see more evidence that inflationary pressures are easing up more broadly before ending its cycle of rate rises.

Meanwhile, Investec Economics predicts the Bank will opt for a ‘wait-and-see approach’ and keep rates at 4% while it assesses the situation.

Economist Ellie Henderson said: ‘The MPC will have to assess which is the lesser of two evils: the risk of inflation being higher for longer or the current threat to financial stability stemming from the rapidly evolving fears of a banking crisis.’


Snowflake IPO: In-Depth Analysis

September 11, 2020 by www.forbes.com


Snowflake is the most anticipated IPO of the year. Investors should decide in advance how much they are willing to pay, as Snowflake will test the upper limits of what it means to have a stretched valuation. Heck, the company has even inspired value legend Warren Buffett to change his thesis and invest in an IPO prior to profitability (!)

Perhaps that is because the company delivered sky-high revenue growth of 173% last fiscal year and 121% in the most recent quarter, with a record-breaking net retention rate of 158%, the highest of any public cloud company at the time of listing.

These industry-leading numbers are due to the company disrupting the data warehousing market with a superior cloud data platform that delivers across key differentiators (we review this below). Despite Snowflake demonstrating excellent product-market fit, clear competitive advantages, and strong management — no company is perfect. We go over a few key risks that investors should keep in mind as the bidding becomes fierce on opening day.

Snowflake Financials

Snowflake has strong financials for a tech IPO, yet it’s important to remember the product has been available for only six years and tech growth is typically strongest in the early days. The company delivered 173% growth in the fiscal year ending January 31, growing from $96.7 million to $264.7 million with gross profit margins of 56.2%.

These gross margins are below what cloud companies are capable of, yet they improved in the most recent period. Revenue grew 133% year-over-year in the first six months of fiscal 2021 ending in July, growing from $104 million to $242 million with improving gross profit margins of 61.5%.

In the most recent quarter, the company reported growth of 121%. Here, we already see the effects of age within a short time period as Snowflake settles from 173% growth to 133% and now to 121%. This is not a negative by any means (triple-digit growth is to be celebrated), but keep its age in perspective when comparing Snowflake to any high-growth cloud SaaS peers.
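Those growth figures are easy to sanity-check with back-of-the-envelope arithmetic (a quick sketch using the revenue figures quoted above):

```python
def growth_rate(prior: float, current: float) -> float:
    """Year-over-year growth, expressed as a percentage."""
    return (current / prior - 1) * 100

# Fiscal year ending January 31: $96.7M -> $264.7M in revenue
fy_growth = growth_rate(96.7, 264.7)   # ~173.7%, the "173%" quoted

# First six months of fiscal 2021: $104M -> $242M
h1_growth = growth_rate(104, 242)      # ~132.7%, the "133%" quoted

print(round(fy_growth, 1), round(h1_growth, 1))
```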

The bottom line has varied depending on the period you look at. Net losses doubled from $178 million in fiscal year 2019 to $348.5 million in fiscal year 2020.

More recently in the first six months of fiscal 2021, the net losses were flat period-over-period at $177.2 million compared to losses of $171.3 million. This could be an encouraging sign or it could be Snowflake tightening the belt temporarily for the public offering before returning to the original pace of worsening losses. There is not enough history to know if the more encouraging flat rate of losses is sustainable. Adjusted EPS was negative $1.63 in the fiscal year ending in January compared to negative adjusted EPS of $0.72 in the first half of fiscal 2021.

Net retention rate for Snowflake is a record 158%, the highest of any company at the time of going public. However, it’s important to remember that net retention rates decline over time as customers become harder to retain long-term (I cover net retention rates more in depth here).

The company was founded in 2012 yet the product came out of stealth mode in 2014. When considering the product launch, Snowflake is a very young company of only six years old.

You can see evidence of how net retention is affected by company age in Snowflake’s S-1 filing: the rate was 223% in the first half of 2019 compared to 158% in the same period a year later. Annually, the company lost 11 percentage points of net retention, from 180% to 169%.
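Net retention rate itself is a simple ratio: revenue from a fixed cohort of existing customers today divided by that same cohort’s revenue a year ago, so expansion net of churn can push it well above 100%. A minimal sketch with hypothetical cohort figures:

```python
def net_retention_rate(cohort_rev_prior: float, cohort_rev_now: float) -> float:
    """Revenue from the same customer cohort now vs. a year ago, as a percent."""
    return cohort_rev_now / cohort_rev_prior * 100

# Hypothetical cohort: $100M a year ago, $158M now (expansion net of churn)
nrr = net_retention_rate(100, 158)
print(nrr)  # 158.0 -- a rate like Snowflake's
```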

Regardless, Snowflake has impressive numbers. Perhaps the most impressive key metric in the S-1 filing is the growth in the percentage of customers with product revenue greater than $1 million. This has grown considerably from 14% in fiscal year 2019 to 41% in fiscal year 2020. There is evidence high-end accounts are continuing to grow with the first six months of 2020 at 56% compared to 22% in the year-ago period.

The new CEO, Frank Slootman, clearly knows how to make a company attractive to investors. Not only did the company quickly tighten its belt in regard to net losses, it also doubled customers from 1,547 to 3,117 over the past twelve months. This includes 7 of the Fortune 10 and 146 of the Fortune 500.

Cash used in operating activities decreased from $110 million to $45.3 million in the first six months of fiscal 2021. The company has cash and investments of $591 million and no debt.

As outlined in the S-1, IDC places the addressable market for Analytics Data Management and Integration Platforms and Business Intelligence and Analytics Tools at $56 billion in 2020 and $84 billion in 2023.

In an effort to narrow this addressable market, I dug up a few more sources. According to MarketsandMarkets, the addressable market for Data Warehouse-as-a-Service is much smaller: $1.2 billion in 2018, set to grow to $3.4 billion by 2023 at a CAGR of 23.8%. P&S Intelligence reports a similar CAGR of 29.2%, estimating the Data Warehouse-as-a-Service market will reach $23.8 billion by 2030. Combining cloud and on-premise, Allied Market Research places the data warehousing market at $34.7 billion by 2025.
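Those CAGR figures come from the standard formula, CAGR = (end / start)^(1 / years) - 1; here is a quick check against the MarketsandMarkets numbers:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate, as a percentage."""
    return ((end / start) ** (1 / years) - 1) * 100

# Data Warehouse-as-a-Service: $1.2B (2018) -> $3.4B (2023), 5 years
print(round(cagr(1.2, 3.4, 5), 1))  # roughly 23%, in line with the quoted 23.8%
```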

You’ll find larger addressable markets in tech but the weight Snowflake brings to the category is considerable.

Snowflake’s former CEO, Bob Muglia, grew the company from 80 customers in 2015 to 1,000 customers in early 2018, when he was replaced by Frank Slootman. The change likely happened due to pressure from private investors who want a grand slam exit (and it looks like they’ll be getting just that).

Slootman is known for resuscitating Data Domain from nearly running out of money in 2003 to an acquisition in 2009, after the company “grew to sell more than all its competitors combined.” This was detailed in a book Slootman wrote called “TAPE SUCKS: Inside Data Domain, A Silicon Valley Growth Story.” Three years later, Slootman took over as CEO of ServiceNow from 2011 to 2017 and grew the company from $75 million in annual revenue to $1.5 billion. This was achieved by diversifying the product beyond the IT department.

For many investors, management is a key factor in deciding to invest or not. Here, Snowflake fires on yet another cylinder.

Product:

Snowflake’s decoupled architecture allows compute and storage to scale separately, with the storage provided from any cloud provider the customer chooses. By processing queries using massively parallel processing (MPP), where each node in the cluster stores a portion of the data set locally, the virtual warehouses can access the storage layer independently and do not compete for compute power. With competitors such as Redshift, where compute and storage are coupled, more time is spent reconfiguring the cluster.

Snowflake calls this offering a virtual data warehouse, where workloads share the same data but can run independently. This is crucial because Snowflake’s competitors combine compute and storage and require customers to size and pay based on the largest workload.

Data warehouses are centralized data repositories that collect and store information across many sources that are both internal and external. The raw data is ingested into the data warehouse and processed to answer queries. To ingest data, warehouses follow the ETL process, which is: (1) Extract the data from the internal or external database or file, (2) Transform by cleaning and preparing the data to fit the schema and constraints of the data warehouse and (3) Load into the data warehouse. The ETL method helps to organize the data into a relational format. Notably, Snowflake supports both ETL and ELT, which allows for data transformation during or after loading.
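The three ETL steps above can be sketched in a few lines; this is a toy illustration with an invented schema and cleaning rule, not Snowflake’s actual loading API:

```python
import json

# (1) Extract: pull raw records from a source file or API (here, inline JSON)
raw = '[{"name": " Ada ", "spend": "120.5"}, {"name": "Grace", "spend": "88"}]'
records = json.loads(raw)

# (2) Transform: clean and coerce to the warehouse schema before loading
transformed = [
    {"name": r["name"].strip(), "spend": float(r["spend"])}
    for r in records
]

# (3) Load: append into the (toy) warehouse table
warehouse_table = []
warehouse_table.extend(transformed)

# ELT would instead load `records` as-is and transform inside the warehouse.
print(warehouse_table)
```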

One key product differentiator is that Snowflake is not built on Hadoop, rather the company uses a new SQL database engine with cloud-optimized architecture . Overall, this translates to faster queries and also reduces costs by scaling up or down for both capacity and performance. This also allows the shift to the cloud while still honoring traditional relational database tools. Just like cloud infrastructure does not require you to hold server space for peak times year-round, a cloud data warehouse does not require you to plan, acquire or manage resources for peak data demand (i.e. elasticity).

The need for resources can change in either direction (scaling up or down). Customers that need storage but have less need for CPU computation do not have to pay up front and can shrink the environment dynamically. Users either pay for terabytes stored or are billed on a per-second basis for computation. Notably, Snowflake charges by execution-based usage; it is not a cloud SaaS company that charges by subscription.
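To make the metering model concrete, here is a toy bill calculator; the rates are invented for illustration and are not Snowflake’s actual price list:

```python
# Hypothetical rates -- illustrative only, not Snowflake's actual pricing.
STORAGE_PER_TB_MONTH = 23.0    # dollars per terabyte stored per month
COMPUTE_PER_SECOND = 0.0008    # dollars per second of warehouse runtime

def monthly_bill(tb_stored: float, compute_seconds: float) -> float:
    """Storage and compute are metered -- and billed -- independently."""
    return tb_stored * STORAGE_PER_TB_MONTH + compute_seconds * COMPUTE_PER_SECOND

# 10 TB stored, but only 2 hours of queries all month:
print(round(monthly_bill(10, 2 * 3600), 2))  # 235.76
```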

Snowflake has a multi-cluster architecture which is unique from single cluster databases. The multi-cluster approach allows the clusters to access the same underlying data yet to run independently. This allows for heavy queries and operations to run very quickly and with fewer errors because the queries are not accessing the same data warehouse.

Queries are made with standard SQL for analytics, and the platform integrates with the R and Python programming languages. The company delivers the ability to handle incongruent data types in a single data warehouse. Because the data is accessible through SQL, the most common database language, there is widespread developer uptake.

Snowflake supports both structured and semi-structured data. As machine-generated data grows to include applications, sensors and mobile devices, Snowflake allows semi-structured data to be handled without preparation or schema definitions. The result is handling JSON, Avro, ORC, Parquet or XML data as if it were relational and structured.

Snowflake uses a compressed columnar database. Columnar databases are optimized for the fast retrieval of columns of data and are used for analytic queries. Other features include centralized metadata management, stored in a single key-value store, which allows cloning of tables and databases. Security is baked into the platform: Snowflake automatically encrypts all data, to the point where unencrypted data is not even allowed. There is third-party certification and validation for security standards like HIPAA.
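The row-versus-column distinction is easy to illustrate: a columnar store keeps each column contiguous, so an analytic aggregate scans only the column it needs (a toy sketch, not Snowflake’s storage format):

```python
# Row-oriented: each record stored together -- good for fetching whole rows.
rows = [("alice", 30, "US"), ("bob", 25, "UK"), ("carol", 35, "US")]

# Column-oriented: each column stored contiguously -- good for analytics,
# and far easier to compress since values in a column are similar.
columns = {
    "name":    ["alice", "bob", "carol"],
    "age":     [30, 25, 35],
    "country": ["US", "UK", "US"],
}

# An aggregate over one column scans a single contiguous list,
# rather than stepping through every row tuple:
avg_age = sum(columns["age"]) / len(columns["age"])
print(avg_age)  # 30.0
```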

Beyond the value proposition of separating storage from compute for speed, and also scaling up or down to reduce costs, the third takeaway is that Snowflake is also much easier for customers to use as it’s designed to remove the role of a database administrator for monitoring and/or to tune query performance.

The end goal of choosing Snowflake is that you load data, run queries, and do little else – which is an immense value proposition due to the amount of time wasted prepping, balancing, tuning and monitoring traditional data warehouses originally built for on-premise.

Snowflake is capitalizing on the multi-cloud trend and growing rapidly with customers who want a choice of public cloud provider, despite the cloud giants having their own data warehouse systems, such as Amazon Redshift, Azure Synapse and Google BigQuery.

Generally speaking, BigQuery is the closer competitor, as Google’s offering also separates storage and compute. The pricing structures differ: Snowflake uses a time-based model, where users are charged for execution time, while BigQuery uses a query-based model, where users are charged for the amount of data processed by their queries. BigQuery also has a serverless feature that makes it easier to begin using the data warehouse, as it removes the need for manual scaling and performance tuning. Dremel is the query engine for BigQuery.

When it comes to deciding between BigQuery and Snowflake, it can come down to what you do with the database, largely because of these pricing differences. For instance, Snowflake is a better choice for concurrent users and business intelligence. It's also a great choice for data-as-a-service, where you might give clients access to your data in the form of analytics. BigQuery is perhaps a better choice for ad hoc reporting, where you run occasional complex reports on a quarterly basis, or for recommendation models and machine learning workloads with high idle time. Again, these examples come down mainly to pricing structure.

Despite BigQuery having a strong following with nearly twice the number of companies as Snowflake and growing around 40%, it tested slower than Snowflake in field tests performed by GigaOm in 2019. Vendor lock-in from BigQuery is also undesirable, as companies may prefer AWS or Azure and/or more interoperability or best-in-breed solutions – we can see this in the growing trend toward multi-cloud. AWS Redshift has the biggest market presence but growth is nearly flat at 6.5%, and AWS is the leading partner for Snowflake.

Here’s a great write-up from the Hashmap Engineering and Technology Blog that points out why optimized row columnar (ORC) format data loads work well for either Snowflake or Amazon Redshift. Ultimately, the choice of system comes down to individual implementation needs, although Snowflake is designed to compete in nearly every case.

There’s a great write-up from analyst David Vellante that discusses how Snowflake competes with cloud native database giants. His analysis discusses survey responses from CIOs and IT buyers with Snowflake having a lead over the tech giants in spending intentions. The Enterprise Technology Research study he highlights showed 80% of AWS accounts plan to spend more on Snowflake in 2020 relative to 2019 with 35% adding Snowflake as new compared to 12% adding Redshift as new. In Azure, 78% plan to spend more on Snowflake with 41% adding new. On Google Cloud, 80% plan to increase spending on Snowflake. We can see the people have spoken.

A few risks …

Due to Snowflake’s product strengths, the public cloud providers offer Snowflake while at the same time competing with it. The most discussed risk is that public cloud providers have competing databases, but in reality, the bigger risk may be pricing pressure over time. Snowflake has a great top line; however, the bottom line is affected by its partnerships with competitors, and the tech giants can greatly undercut Snowflake on pricing. Therefore, margins may be an inherent issue.

The company pays quite a bit for sales and marketing, which is typical for a company going public as this strengthens the top line yet could make it hard to balance this growth with profitability in the future. (But hey, if Berkshire doesn’t care, why should we!)

In the S-1 filing, it was noted that Salesforce will buy $250 million in stock in a private placement. This could be a risk if Salesforce becomes too intertwined with Snowflake, as its best possible growth will be achieved by staying neutral, in my opinion. This involvement is something to monitor.

As stated, Berkshire Hathaway is also intending to purchase $250 million in shares in a private placement, plus an additional $300 million from an unnamed stockholder in a secondary transaction. As Business Insider pointed out, this involvement from Berkshire is “rarer than a unicorn” and will be viewed as a strength by both institutions and retail investors.

There is some risk in Snowflake being cloud-native only, with no hybrid or on-premises offering. This can limit the customer pool, as some enterprises prefer hybrid options. Perhaps the bigger picture for Snowflake’s strength will be leveraging artificial intelligence in applications and business intelligence, in which case a hybrid or on-premises offering won’t be as necessary.

Valuation

Snowflake’s amended filing on September 8 shows the company will be priced at $75 to $85 per share, with a valuation between $20.9 billion and $23.7 billion. This would raise $2.7 billion. Snowflake’s last private valuation was $12.5 billion, when the company raised $479 million in a Series G round.

When we look at various scenarios, we see Snowflake hitting 40 forward price-to-sales in the $30 billion valuation range.

Snowflake is not profitable, while Shopify, Zoom Video and Datadog are profitable, with some showing accelerating revenue. These three have commanded above a 40 forward price-to-sales only in perfect conditions; the majority of their trading history has been beneath a 30 forward price-to-sales. Being profitable should come with a premium, yet Snowflake will likely inch its way into this valuation range without demonstrating profitability.

When we look at Zoom Video, Crowdstrike and Datadog, we see these three traded at or beneath their opening IPO price many times in the year following IPO. Crowdstrike saw roughly a 50% drawdown from its opening price.

Therefore, if Snowflake trades at a 30 forward price-to-sales and sustains this valuation, it will be the first high-growth company with negative earnings to do so. Even those with positive earnings growth have only traded above this valuation for a brief period over the last three months.

A better strategy would be not to pay over this level and to count on history rhyming. At NTM revenue of $750 million, Snowflake would have to open at the price listed in the prospectus to remain within a reasonable $25 billion valuation (“reasonable” being used loosely here, as only a few companies have traded at this valuation, under the most ideal tech-market conditions, and those comparables were profitable).
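The arithmetic behind these multiples is simple enough to write down. Using the article's $750 million NTM revenue figure, forward price-to-sales is just market cap divided by NTM revenue:

```python
# Forward price-to-sales = market cap / next-twelve-months (NTM) revenue.
NTM_REVENUE = 750e6  # the NTM revenue figure used in this analysis

def forward_ps(market_cap: float) -> float:
    return market_cap / NTM_REVENUE

print(forward_ps(30e9))    # 40.0 at a $30B valuation
print(forward_ps(25e9))    # ~33.3 at a $25B valuation
print(forward_ps(22.5e9))  # 30.0 -- a 30x ceiling implies roughly $22.5B
```

So the 30 forward P/S ceiling discussed here corresponds to a market cap in the low-$20-billion range, right around the top of the filing's stated valuation band.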

Conclusion:

When you were a child, your parents probably asked, “are you going to jump off a bridge if everyone else does?” The goal of the question was to get you to think for yourself in the face of peer pressure.

In this situation, the question that should be asked is, “are you going to invest in a company with triple-digit growth, clear product differentiation, key metrics that prove product-market fit and gravity-defying management … if Berkshire does?” The answer is probably “yes.”

The issue is that we aren’t Berkshire or Salesforce, so we will probably overpay. Therefore, the biggest risk of all is how much alpha will be left in the first year of trading by the time retail investors are offered the crumbs.

I’ve participated in IPOs out the gate and the only ones that have paid off were under-hyped (Roku). Those that were over-hyped, such as Zoom Video and Crowdstrike, either retreated back to their opening price or saw up to a 50% haircut from the opening price.

I did not participate in either of these over-hyped IPOs but I did snag Zoom Video later in January of 2020. I was able to put that money to use elsewhere while waiting for the lock-up period to expire and the right entry in the low $60s nearly 9 months after Zoom Video had listed.

Even as a Snowflake enthusiast, I may back off above a 30 forward price-to-sales (and most certainly at 40 forward P/S), as I’m confident I can find many great tech companies that are less hyped while I wait it out. We will always see periods of indiscriminate selling across high-growth names, and I don’t think Snowflake will escape those rotations.

The information contained herein are opinions and not financial advice. Beth Kindig owns shares of Microsoft, Zoom Video, Shopify and Datadog. She may initiate a position in Snowflake.

