Arista Networks (NYSE: ANET)
Q3 2024 Earnings Call
Nov 07, 2024, 4:30 p.m. ET
Overview of the Earnings Call
- Opening Remarks
- Q&A Session
- Participants on the Call
Opening Remarks:
Operator
Good afternoon, and welcome to Arista Networks’ third quarter 2024 earnings conference call. Please note that all participants will be in listen-only mode during the presentation. Following it, we’ll hold a question-and-answer session with instructions provided at that time.
[Operator instructions] This call is being recorded and will be available for playback in the Investor Relations section of Arista’s website. I would now like to turn the call over to Ms. Liz Stine, Arista’s Director of Investor Relations.
Liz Stine — Director, Investor Relations
Thank you, operator. Good afternoon, everyone. We appreciate you joining us today. On this call, I have with me Jayshree Ullal, Arista Networks’ CEO; and Chantelle Breithaupt, our CFO. Earlier today, we released our earnings results for the fiscal third quarter ending September 30, 2024. You can find this release on our website.
During this call, management will share forward-looking statements about our financial outlook for Q4 2024 and longer-term projections for 2025 and beyond. This will include insights into our market strategy, AI initiatives, customer demand trends, and pressures from the supply chain. Please note that actual results could differ from what we discuss, as detailed in our SEC filings.
Financial Highlights
Jayshree Ullal — Chair and Chief Executive Officer
Thank you, Liz, and thank you all for joining our call. In Q3 2024, we recorded revenues of $1.81 billion and achieved a record non-GAAP earnings per share of $2.40. Renewals from services and software support comprised about 17.6% of our revenue. Our non-GAAP gross margin for the quarter stood at 64.6%, influenced both by competitive pricing in the cloud sector and favorable margins in enterprise services.
International sales accounted for about 18% of our revenue, while the Americas remained strong at 82%. We are optimistic about the company’s performance and plans for the future. Earlier this year, during our 10th anniversary celebration, we shared comprehensive insights that would typically be reserved for an Analyst Day, touching upon our strategies for 2025 and beyond.
Our Vision Moving Forward
We believe the role of networks in critical transactions continues to expand. Our Arista 2.0 strategy positions us as a leader in this evolving landscape. Our networking platforms enable the shift from siloed systems to centers of data, spanning data centers, campuses, and AI centers.
Central to our efforts is the EOS software stack, designed for multi-modal datasets. From this foundation, we support AI and machine learning initiatives, ensuring our clients have the data they need to succeed in these areas. Arista’s architecture facilitates efficient client-to-cloud networking and drives our innovations for AI.
Key elements of our network strategy include:
- High availability products with built-in resilience and seamless upgrades.
- Zero-touch automation that enhances operations and reduces reliance on staff.
- Advanced insights for AI networking, delivering powerful algorithms for security and performance analysis.
As we move through 2024, we anticipate a greater focus on AI networking, moving from initial trials to pilot programs that connect progressively larger GPU clusters. This approach should solidify our standing in the market while meeting the increased demand for AI infrastructure.
Insights on AI Networking
Our understanding of AI traffic flows is crucial as networking solutions evolve. Effective communication across all layers is vital, because in synchronized AI training the slowest flow can gate overall job completion time. Consequently, our AI infrastructure must efficiently connect back-end systems with front-end operations to support diverse workloads.
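To make that point concrete, here is a minimal Python sketch (illustrative, not from the call); the millisecond values are hypothetical placeholders. In a synchronized training step, every worker waits for the slowest gradient exchange, so a single congested flow sets the effective step time.

```python
# Illustrative only: in a synchronized AI training step, every worker must
# finish exchanging its gradients before the next step begins, so the step
# time is gated by the slowest flow rather than the average one.
# The millisecond figures below are hypothetical placeholders.

flow_completion_ms = {
    "gpu_rack_1": 12.0,
    "gpu_rack_2": 11.5,
    "gpu_rack_3": 12.2,
    "gpu_rack_4": 48.0,  # one congested or poorly balanced link
}

average_ms = sum(flow_completion_ms.values()) / len(flow_completion_ms)
step_time_ms = max(flow_completion_ms.values())  # the collective waits for the slowest flow

print(f"average flow completion: {average_ms:.1f} ms")
print(f"effective step time:     {step_time_ms:.1f} ms")
```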
Arista is becoming a leader in scalable Ethernet technologies, particularly for large-scale AI training. Our new portfolio offers 800-gig throughput and exceptional performance, supporting networks designed for up to 100,000 GPUs and beyond.
Arista Networks Expands Its AI Networking Portfolio at OCP
New Developments Enhance High-Density GPU Clusters with Advanced Switching Technology
The accelerated AI networking portfolio of Arista Networks now spans three series that collectively offer over 20 switching products, moving beyond a single-switch focus. At the Open Compute Project (OCP) event held in mid-October 2024, the company introduced a new platform, the Distributed Etherlink 7700, designed to enable two-tier networks supporting clusters of up to 10,000 GPUs.
The 7700R4 Distributed Etherlink Switch was created in partnership with Meta, embodying innovation in AI networking. Although it behaves like a traditional two-tier leaf-spine network, the DES offers single-stage forwarding and an efficient spine fabric, eliminating the need for complex tuning and streamlining failover for large-scale AI accelerator clusters. The new switch complements Arista’s flagship 7800 AI spine, which is tailored for maximum scalability with differentiated traffic management and a sophisticated queuing system, ultimately preserving critical AI processing resources and speeding up job completion.
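For context on that scale claim, a hedged sketch of standard two-tier leaf-spine arithmetic follows; the switch radix values are placeholders rather than Arista specifications. With half of each leaf's ports facing GPUs and half facing spines, a non-blocking two-tier fabric tops out at roughly half the radix squared, which is how such a design reaches the 10,000-GPU range at high radix.

```python
# Generic two-tier (leaf-spine) scale arithmetic -- illustrative only.
# Assumes a non-blocking design in which each leaf splits its ports evenly
# between downlinks (to GPUs/NICs) and uplinks (to spines). The radix values
# are placeholders, not Arista product specifications.

def two_tier_capacity(radix: int) -> int:
    downlinks_per_leaf = radix // 2   # ports facing GPUs
    max_leaves = radix                # each spine has 'radix' ports, one per leaf
    return max_leaves * downlinks_per_leaf   # ~ radix**2 / 2 endpoints

for radix in (64, 128, 144):
    print(f"{radix}-port switches -> up to {two_tier_capacity(radix):,} endpoints")
```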
John McCool, Arista’s Chief Platform Officer, delves deeper into the company’s 2024 platforms and recent supply chain progress next on the call.
John McCool — Senior Vice President, Chief Platform Officer
Thank you, Jayshree. I’m excited to share that the Arista 7700R4 Distributed Etherlink Switch and 7800R4 spine, along with the 7060X6 AI leaf announced in June, have officially entered production. This provides our customers with an expansive array of 800-gigabit-per-second Ethernet options tailored for AI networks. By utilizing 800-gigabit-per-second parallel optics, users can now connect two 400-gigabit-per-second GPUs to each port, greatly increasing deployment density compared to existing solutions. This extensive selection enables our customers to optimize their network configurations effectively, reducing operational complexity.
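A small sketch of the density arithmetic described above, under stated assumptions; the per-switch port count is a hypothetical placeholder, not a product specification.

```python
# Illustrative breakout arithmetic: 800G parallel optics split into 2 x 400G,
# so each physical switch port can attach two 400G GPU NICs.
# The port count below is a hypothetical placeholder, not a specific SKU.

ports_per_switch = 64           # hypothetical fixed-switch port count
breakout_per_800g_port = 2      # one 800G port -> two 400G attachments

gpus_at_native_400g = ports_per_switch
gpus_at_800g_breakout = ports_per_switch * breakout_per_800g_port

print(f"400G ports, no breakout:  {gpus_at_native_400g} GPU attachments")
print(f"800G ports, 2x400G split: {gpus_at_800g_breakout} GPU attachments")
```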
As AI deployments ramp up, customers are also re-evaluating their front-end networks. New AI clusters demand high-speed port connections to the existing infrastructure. Additionally, the uplift in bandwidth is necessary for training data access and swift result generation from the clusters. This trend is boosting demand for our 7800R3 400-gigabit solution.
Though post-pandemic supply chains have stabilized, the procurement timeline for advanced semiconductors remains longer than pre-pandemic norms. To ensure sufficient availability of high-performance switching silicon, we have increased our purchase commitments for these critical components. Additionally, we are ramping up stock levels to cater to the fast deployment of new AI networks, which should help decrease lead times as we progress into the next year. Our supply chain team is collaborating closely with planning departments to synchronize component arrivals with anticipated customer orders.
Next-generation data centers incorporating AI face challenges with increased power consumption while striving to double network performance. Our integrated electrical and mechanical design processes allow us to make strategic design decisions across various systems to optimize our solutions. Our collaborative work with leading cloud companies has provided crucial insights into diverse switch configurations necessary for optimized data center setups. Moreover, our advanced software capabilities, including SDK integration, diagnostics, and data analytics, are geared to expedite design and production processes, ensuring first-time success in deployment.
We remain confident in executing our strategic roadmap within this rapidly changing AI networking landscape. Back to you, Jayshree.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thank you, John, and congratulations on a highly successful year to you and the entire executive team, including Alex Rose, Mike Kappus, and Luke Colero. Your efforts have been remarkable. A significant driver of AI networking’s swift adoption is the forthcoming Ultra Ethernet Consortium (UEC) specification, which Arista has contributed to as a founding member. The UEC ecosystem now counts over 97 members.
In our opinion, Ethernet remains the only sustainable pathway for open standards-based AI networking. Arista is committed to establishing comprehensive AI centers, leveraging our unparalleled EOS (Extensible Operating System) and the superior automation capabilities provided by CloudVision. Our EOS delivers dynamic traffic management through cluster load balancing solutions and allows for system upgrades without interrupting ongoing traffic, thus facilitating uninterrupted data flow during upgrades. We work with various AI accelerators and are committed to providing advanced EOS visibility down to the host level.
Turning to our 2025 goals, as discussed at our New York Stock Exchange event in June, we now project our total addressable market (TAM) to expand to $70 billion by 2028. We grew 33.8% in 2023, and we expect 2024 growth of at least 18%, surpassing our earlier estimate of 10% to 12%. That acceleration, driven by faster-moving AI pilot programs, positions us for anticipated annual growth of 15% to 17% next year, translating to approximately $8 billion in 2025 revenue.
We expect to meet our 2025 targets of $750 million each for campus and back-end AI networking, targets we set a couple of years ago. It is important to note, however, that AI back-end buildouts also pull through spending on the front-end network and its configuration; depending on the training requirements, that ratio can range from 30% to 200%. This means our overall AI center networking number could reach approximately $1.5 billion in 2025.
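As a rough illustration of the arithmetic behind that figure, using only the $750 million target and the 30% to 200% ratios cited on the call (100% is a midpoint working assumption, not guidance):

```python
# Back-end AI target plus the implied front-end uplift, using only the ratios
# cited on the call (30% to 200%); 100% is a midpoint working assumption,
# not guidance.

backend_target_musd = 750  # 2025 back-end AI networking target, in $M

for frontend_ratio in (0.30, 1.00, 2.00):
    total_musd = backend_target_musd * (1 + frontend_ratio)
    print(f"front-end ratio {frontend_ratio:.0%}: "
          f"total AI center networking ~ ${total_musd:,.0f}M")
```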
Our strategy involves maintaining a trajectory of double-digit growth and a three-year compound annual growth rate (CAGR) in the teens from 2024 to 2026. More details will follow from our Chief Financial Officer. So now, I hand it over to Chantelle.
Chantelle Breithaupt — Chief Financial Officer
Thank you, Jayshree. I will now provide an overview of financial performance based on our Q3 results and guidance for Q4 fiscal year ’24, referencing non-GAAP metrics. This excludes non-cash stock-based compensation, amortization of intangible assets, and other one-off items. A detailed reconciliation of our selected GAAP to non-GAAP results is available in our earnings release.
Total revenues for the quarter were $1.81 billion, which is a 20% increase compared to the prior year. This strong performance exceeded our guidance, which was set in the range of $1.72 billion to $1.75 billion. Subscription services and software contributed about 17.6% to our revenues this quarter.
International revenues amounted to $330.9 million, or 18.3% of total revenue, a slight decline from 18.7% last quarter, reflecting a higher contribution from domestic shipments to our cloud and enterprise customers. Overall gross margin in Q3 was 64.6%, above the upper end of our guidance and up from 63.1% in Q3 last year.
Operating expenses for the quarter were $279.9 million, or 15.5% of revenue, down from $319.8 million in the previous quarter. Research and development expenses were $177.5 million, or 9.8% of revenue, down from $216.7 million last quarter, largely because R&D expenses anticipated in Q3 are now expected to land in Q4. R&D headcount grew by a low double-digit percentage compared to Q3 of the prior year.
Strong Q3 Performance and Strategic Plans Ahead for Arista Networks
Substantial Revenue and Income Growth
Arista Networks reported strong financial results for the quarter. Sales and marketing expenses were $83.4 million, or 4.6% of revenue, a slight decrease from the previous quarter. General and administrative costs were $19.1 million, or 1.1% of revenue, in line with last quarter. Operating income reached $890.1 million, or 49.1% of revenue, boosted by the shift of some R&D expenses into Q4 of this year.
Net Income and Earnings Per Share Rise
The company recorded a favorable other income and expense figure of $85.3 million, with an effective tax rate of 21.1%. This led to a net income for the quarter of $769.1 million, which is 42.5% of revenue. With 325.5 million diluted shares outstanding, the diluted earnings per share hit $2.40, marking a considerable increase of 31.1% compared to the previous year, also influenced by the R&D expense shift.
Healthy Balance Sheet and Share Buyback Program
As for cash reserves, Arista concluded the quarter with approximately $7.4 billion in cash, cash equivalents, and investments. During this time, the firm repurchased $65.2 million of its common stock at an average price of $318.14 per share. Of the $1.2 billion share repurchase program initially approved in May 2024, $1 billion remains available for future buybacks. Future repurchases will depend on various market and business conditions.
Strong Cash Flow and Inventory Management
Operating cash flow for the third quarter was about $1.2 billion, showcasing robust earnings along with effective working capital management. Days sales outstanding (DSOs) improved to 57, down from 66 days in Q2, thanks to better collections. Inventory turns were also positive, increasing from 1.1 to 1.3 times, with inventory falling to $1.8 billion from $1.9 billion earlier, indicating a reduction in raw materials inventory. Total purchase commitments and inventory were at $4.1 billion, demonstrating a slight rise from $4 billion at the end of Q2.
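For readers less familiar with these metrics, a hedged sketch of the standard definitions applied to the quoted figures; the receivables and cost inputs are assumed placeholders, not disclosed values.

```python
# Standard working-capital formulas, shown for illustration only. The
# receivables and annualized cost-of-revenue inputs are hypothetical
# placeholders chosen to land near the quoted DSO (~57 days) and inventory
# turns (~1.3x); they are not disclosed figures.

def days_sales_outstanding(receivables, quarterly_revenue, days_in_quarter=91):
    return receivables / quarterly_revenue * days_in_quarter

def inventory_turns(annualized_cogs, inventory):
    return annualized_cogs / inventory

quarterly_revenue = 1.81e9   # from the call
inventory = 1.8e9            # from the call
receivables = 1.13e9         # placeholder assumption
annualized_cogs = 2.3e9      # placeholder assumption

print(f"DSO ~ {days_sales_outstanding(receivables, quarterly_revenue):.0f} days")
print(f"inventory turns ~ {inventory_turns(annualized_cogs, inventory):.1f}x")
```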
Deferred Revenue and Future Product Initiatives
The total deferred revenue balance reached $2.5 billion, rising from $2.1 billion in Q2, largely due to service-related contracts. Notably, product deferred revenue increased about $320 million compared to the last quarter. As fiscal 2024 progresses, multiple new product launches, client acquisitions, and enhanced usage scenarios have led to more customer trials, influencing these deferred revenue balances.
Q4 Guidance and Strategic Investments
Looking ahead to Q4, Arista provided guidance based on non-GAAP results, excluding noncash stock-based compensation and other one-time items. Expected revenue lies between $1.85 billion and $1.9 billion, with projected gross margins of approximately 63% to 64% and an operating margin near 44%. The effective tax rate is anticipated at around 21.5%, with diluted shares expected to be roughly 321 million pre-split. While operating cash has seen significant growth recently, an increase in working capital requirements is expected primarily due to rising inventory levels necessary for AI networks as 2025 approaches.
Stock Split and Future Growth Outlook
Arista announced a four-for-one stock split approved by its board, aimed at enhancing stock accessibility for retail investors. It’s essential to understand that while the number of shares will increase, the company’s intrinsic value remains unchanged. Transitioning into fiscal 2025, Arista anticipates a revenue growth of 15% to 17%, with a shifting focus on cloud and AI customers leading to gross margins of 60% to 62% and operating margins of approximately 43% to 44%. The firm continues to emphasize investments in R&D and scaling operations, targeting around $8 billion in revenue by 2025.
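The split mechanics can be sketched as follows, purely for illustration; the pre-split share price is a hypothetical placeholder, not a market quote. A four-for-one split multiplies the share count by four and divides per-share figures by four, leaving market capitalization unchanged.

```python
# Four-for-one stock split mechanics -- purely illustrative; the pre-split
# share price below is a hypothetical placeholder, not an actual quote.

split_ratio = 4
pre_split_shares = 321e6     # approximate diluted share count from guidance
pre_split_price = 400.00     # hypothetical price per share

post_split_shares = pre_split_shares * split_ratio
post_split_price = pre_split_price / split_ratio

print(f"market cap before: ${pre_split_shares * pre_split_price / 1e9:.1f}B")
print(f"market cap after:  ${post_split_shares * post_split_price / 1e9:.1f}B")
print(f"shares outstanding: {pre_split_shares/1e6:.0f}M -> {post_split_shares/1e6:.0f}M")
```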
In conclusion, Arista reaffirms a commitment to double-digit growth and aims for a compounded annual growth rate (CAGR) in the mid-teens for fiscal years 2024 through 2026. The company is positioning itself strongly to leverage current market conditions and deliver returns to shareholders. Now, we will open the floor for questions.
Liz Stine — Director, Investor Relations
Thanks, Chantelle. We will now transition to the Q&A segment of the Arista earnings call. To ensure everyone can participate, please limit yourselves to one question. Thank you for your understanding.
Operator
The Q&A segment is now live. Our first question is from Samik Chatterjee with JPMorgan. Please proceed.
Samik Chatterjee — Analyst
Hello, and thank you for taking my question. It’s great to see such robust results. Regarding guidance, you’re targeting $750 million in back-end AI revenue along with a similar campus revenue goal, which implies that the non-AI, non-campus segments may see only single-digit growth next year. Considering the strong double-digit performance in 2024 following the backlog challenges, could you elaborate on the expected deceleration in those segments and the factors influencing this outlook? Thank you.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thank you, Samik. As you know, we typically have visibility for about six months, so we are cautious not to speculate too far ahead. As we entered 2024, we experienced unexpected acceleration in AI projects, particularly among major cloud clients, which influences our outlook going forward.
Arista Networks Provides Insight on AI Trials and Competitive Landscape
Market Trends in AI Trials and Customer Engagement
During the call, Arista Networks discussed its progress on current AI trials. CEO Jayshree V. Ullal indicated that the company is performing well across customer segments, although challenges remain in some areas. “We believe we’re actually five out of five in trials,” Ullal noted, reflecting an optimistic outlook.
Four of these involve major customer clusters, and three of those are expected to move to larger-scale pilots this year, with GPU clusters anticipated to reach up to 100,000 by 2025. One customer, however, is experiencing delays, primarily tied to new GPUs and power and cooling readiness. Ullal characterized the overall trajectory positively, stating, “Three of these I would give an A, while the fourth is getting started.”
NVIDIA’s Market Expansion and Competitive Positioning
When examining the competitive dynamics, analyst Tal Liani raised a question about NVIDIA’s growing presence in the data center market, noting that their share increased from 4% to 15%. Ullal acknowledged NVIDIA as a partner, stating, “If we didn’t have the ability to connect to their GPUs, we wouldn’t have all this AI networking demand.” However, she also highlighted that the real competition emerges in Ethernet capabilities, where Arista positions itself as the leading expert.
Ullal cited Arista’s comprehensive portfolio that caters to both large-scale enterprises and smaller GPU clusters. Despite increasing competition, she emphasized their commitment to maintaining a competitive edge in Ethernet switching.
Broadening the Competitive Landscape Amid AI Advances
Analyst Simon Leopold sought broader insight into the competitive landscape, including competitors like Cisco and Juniper, particularly in AI. Ullal explained that the picture differs significantly between the back end and the front end. On the back end, where networks connect natively to GPUs, InfiniBand remains the entrenched incumbent; Arista aims to build credibility and experience there but does not claim market leadership yet.
In contrast, Ullal touted Arista’s stronger position on the front end, citing their robust networking solutions as superior to those of their competitors. She highlighted their advancements in technology and potential partnerships, ensuring that Arista is well-equipped to navigate the complexities brought about by rising AI demands.
Ullal concluded confidently, “Competitively, I would say we’re doing extremely well in the front end, and it’s incremental on the back end.” The company’s evolution over the past 12 years enables it to tackle these challenges effectively as it continues to strengthen its market position.
Conclusion
As Arista Networks navigates the shifting currents of the AI landscape, its ability to adapt and respond to market changes presents a promising future. The company’s focus on evolving customer needs and competitive differentiation underscores its strategy in the thriving tech sector.
A Look at Arista Networks’ Future and AI Investments
Assessing AI Visibility and Customer Progress
Ben Reitzes — Melius Research — Analyst
I wanted to ask a little more about the $750 million in AI for next year. Has your visibility on that improved over the last few months? I also wanted to reconcile your comment about the fifth customer going slower than expected. It sounds like you’re now five for five, but is that fifth customer going slower limiting upside or your visibility there? Or has visibility actually improved and you are simply being more conservative over the last few months? Thanks a lot.
Jayshree V. Ullal — Chair and Chief Executive Officer
Somebody has to bring up conservative, Ben, but I think we’re being realistic. You said it right: we have good visibility with at least three out of the five, for the next six months, maybe even 12. John, what do you think?
John McCool — Senior Vice President, Chief Platform Officer
Yeah.
Jayshree V. Ullal — Chair and Chief Executive Officer
Regarding the fourth customer, we are still in early trials and have work to do. We’ll see how it develops, but we aren’t counting on 2025 being a standout year for that fourth customer; it’s probably more likely to be 2026. As for the fifth, we’re somewhat stalled, which explains our caution in predicting their performance.
They may come through in the second half of ’25, and we will keep you updated. However, if they don’t, we’re still feeling confident about our guidance for ’25, right, Chantelle?
Chantelle Breithaupt — Chief Financial Officer
I totally agree. That’s a good question, Ben. I think the way Jayshree categorized them makes sense.
Ben Reitzes — Melius Research — Analyst
OK. Thanks a lot, guys.
Operator
Our next question comes from the line of Karl Ackerman with BNP Paribas. Please go ahead.
Karl Ackerman — Analyst
Yes. Thank you. Jayshree, could you discuss your engagement with hyperscalers? Will the new Etherlink switches and AI spine products be deployed on 800-gig ports? In other words, have these pilots and trials been primarily on 400-gig, or is production moving to 800-gig? And if so, what is the expected mix of 800-gig hardware sales in ’25?
Jayshree V. Ullal — Chair and Chief Executive Officer
That’s a great question. It has always been tough to differentiate between 100 and 400 because someone can split their 400 into 100s. As of today, if you ask John and me, most trials and pilots are on 400-gig, because people are still waiting for the complete 800-gig ecosystem, including NICs, UEC support, and packet spraying capabilities. So, while we are in early trials on 800, most activity is on 400.
For the remainder of 2024, most activity will be on 400-gig, but I expect a better balance between 400 and 800 as we move into ’25.
Karl Ackerman — Analyst
Thank you.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thank you, Karl.
Operator
Your next question comes from the line of Ryan Koontz with Needham and Company. Please go ahead.
Ryan Koontz — Analyst
Great. Thanks for the question. Can we discuss your campus opportunity a bit? Where do you see the most traction? Is it mainly coming from your core strength in moving large data flows around the campus, or are you also seeing interest in WiFi? Please update us on the campus applications and verticals where you’re seeing growth.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thank you. Let me take a step back: our enterprise opportunity has never been stronger. As a pure-play innovator, we are increasingly invited into enterprise deals, even if we don’t always have the sales coverage. Clients are seeking network designs that avoid multiple operating systems and silos, and there’s significant competitive fatigue amid industry consolidation.
We no longer view our enterprise opportunity as centered solely on the data center; there’s data center, campus, WAN, and, of course, some AI integration as well. To answer your campus question more directly, many of our clients are excited about our universal spine, since they can deploy the same spine across their data centers and campuses. That confidence contributes significantly to our $750 million projection, building on their established platforms and upcoming spine projects.
John McCool, if Kumar Srikantan were here, he’d emphasize the importance of measuring edge ports, including Power over Ethernet (PoE), wired, and WiFi. That distinction is crucial.
John McCool — Senior Vice President, Chief Platform Officer
Sounds like Kumar.
Jayshree V. Ullal — Chair and Chief Executive Officer
Exactly. To summarize, we have strong initiatives in the spine; we’re progressing well on the wired side, but our experience with WiFi is still growing. We are working to address that gap.
Ryan Koontz — Analyst
Very helpful, Jayshree.
Jayshree V. Ullal — Chair and Chief Executive Officer
Chantelle, do you want to add anything?
Chantelle Breithaupt — Chief Financial Officer
In addition to the data center, we’re seeing campus strength in verticals such as financials, healthcare, media, and retail.
Jayshree V. Ullal — Chair and Chief Executive Officer
Fed and SLED markets are also important. Historically, this was an area we hadn’t focused on, but now we’re taking it seriously, including setting up a dedicated subsidiary. Chantelle, your efforts in this regard are greatly appreciated.
Chantelle Breithaupt — Chief Financial Officer
Thank you.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thank you, Ryan.
Operator
Our next question will come from the line of Amit Daryanani with Evercore. Please go ahead.
Amit Daryanani — Analyst
Good afternoon. Thanks for taking my question. I hope you could delve into the sizable increases we are seeing in both total deferred revenue and product deferred revenue. Jayshree, historically, when product deferred revenue rises sharply, it often leads to strong revenue acceleration in later years, and you are guiding…
Arista Networks Discusses Future Growth Amid Margin Expectations
Jayshree V. Ullal — Chair and Chief Executive Officer
When thinking about revenue projections for 2025, we have to account for differences in product timelines and the factors influencing that acceleration. Remember that the trials we’ve referenced usually last six to twelve months, and our current projects may span several years, so the results may not all materialize by 2025. Let me turn this over to Chantelle, our CFO, for more detailed insights.
Chantelle Breithaupt — Chief Financial Officer
Thank you, Jayshree. The factors at play involve the use case types, customer demographics, and product mixes, which each have their own timelines, as Jayshree mentioned. We’re starting to see those timelines extend. Additionally, we observe ongoing deferrals and adjustments every quarter. As we approach 2025, we will provide updates on these variables.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thank you, Amit.
Operator
Our next question comes from Meta Marshall with Morgan Stanley. Please proceed.
Meta Marshall — Analyst
Thank you. Jayshree, I’m interested in your plans for pursuing Tier 2 opportunities and other customers that are heavily investing in AI. How do you envision these opportunities evolving for Arista?
Jayshree V. Ullal — Chair and Chief Executive Officer
This is a pertinent question, Meta. We’re focused on five key trials that we believe can scale to 100,000 GPUs or more. Keep in mind that these include significant AI leaders, found among both the large cloud providers and Tier 2 players. While we currently have 10 to 15 smaller trials in traditional enterprises, we concentrate on the larger trials that materially influence our metrics and AI market share, though we definitely see potential beyond these five.
Meta Marshall — Analyst
Great, thank you.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thank you.
Operator
Our next question comes from Sebastien Naji with William Blair. Please proceed.
Sebastien Naji — William Blair and Company — Analyst
Good evening, and thanks for taking my question. I’d like to ask about the Etherlink portfolio. Specifically, can you rank or assess the opportunities across the three families, the fixed leaf switches, the modular spine, and the distributed Etherlink switch, as we look to 2025 and beyond?
Jayshree V. Ullal — Chair and Chief Executive Officer
I’ll tackle that, and John can add detail, since this requires some estimation. Within Etherlink, the fixed 7060 switches are popular due to their familiarity and design, developed in collaboration with Broadcom around the Tomahawk series. The 7800, while lower in unit volume, carries a crucial strategic weight in revenue. Lastly, the 7700 combines attributes of the other two and targets clusters of up to 10,000 GPUs; it is new and unique, offering a configuration no one else in the market currently matches.
John McCool — Senior Vice President, Chief Platform Officer
To add to that, the 7700 draws interest due to its large-scale capabilities. Between the 7060 and 7800, we see customers mixing these products to maximize GPU utility while minimizing complexity.
Jayshree V. Ullal — Chair and Chief Executive Officer
That’s a good point. Organizations often move from a four-way to an eight-way configuration, which drives requests for additional capacity and features.
Sebastien Naji — William Blair and Company — Analyst
Thank you. I appreciate the insights.
Operator
Our next inquiry comes from Aaron Rakers with Wells Fargo. Please continue.
Aaron Rakers — Analyst
Thanks for taking my question. I’d like to transition to the competitive landscape and specifically inquire about the gross margin outlook you provided for 2025 along with your mid-term model. Could you elaborate on the factors contributing to your expectations for margin declines?
Chantelle Breithaupt — Chief Financial Officer
Absolutely, Aaron. The projections you referenced are largely influenced by customer mix. We expect John to maintain effective supply chain management as he has been doing.
Pricing dynamics also play a role; as we increase our customer base, growing margins becomes challenging. We’ll continue assessing these factors as we progress through 2025 and 2026.
Jayshree V. Ullal — Chair and Chief Executive Officer
It’s important to stay vigilant about these trends and how they may evolve.
Aaron Rakers — Analyst
Thank you.
Jayshree V. Ullal — Chair and Chief Executive Officer
Thanks, Aaron.
Liz Stine — Director, Investor Relations
Operator, we have time for one last question.
Understanding the Impact of AI Investments on Revenues
Operator
Our final question will come from the line of Atif Malik with Citigroup. Please go ahead.
Atif Malik — Analyst
Hi. Thank you for taking my question. Jayshree, at some recent conferences, you mentioned that every dollar spent on the back end can drive as much as two dollars of spending on the front end. What specific signs are you monitoring to gauge the AI-driven benefit to the front end versus bandwidth pressure from traditional cloud workloads?
Jayshree V. Ullal — Chair and Chief Executive Officer
Good question, Atif. The effectiveness of AI investments greatly depends on how they are approached. If a company focuses solely on building a back-end cluster with the aim of proving its capabilities, they will likely prioritize completing high-importance training jobs with narrow applications.
What we are increasingly observing, however, is that for every dollar allocated to the back end, companies may expand front-end spending by 30%, 100%, or sometimes even 200%. Applied to our forecasted $750 million for next year, that implies a corresponding increase in front-end traffic and spend. Only part of that will be attributable to AI, with other drivers involved as well, so we anticipate a ratio more in the range of 30% to 100%.
On average, that ratio is around 100%, which effectively doubles the back-end investment. Accurately attributing the growth solely to AI remains challenging: as inference and front-end storage mix with classic cloud operations, isolating the pure AI contribution becomes difficult.
Atif Malik — Analyst
Thanks so much.
Liz Stine — Director, Investor Relations
This concludes the Arista Networks third quarter 2024 earnings call. A presentation detailing our results is available in the Investor section of our website. We appreciate your interest in Arista.
Operator
[Operator signoff]
Participants of the Earnings Call
Liz Stine — Director, Investor Relations
Jayshree V. Ullal — Chair and Chief Executive Officer
John McCool — Senior Vice President, Chief Platform Officer
Chantelle Breithaupt — Chief Financial Officer
Analysts — Samik Chatterjee, Ben Reitzes, Karl Ackerman, Ryan Koontz, Amit Daryanani, Meta Marshall, Sebastien Naji, Aaron Rakers, Tal Liani, Simon Leopold, Antoine Chkaiban, and Atif Malik
This article consists of a transcription from the conference call prepared for The Motley Fool. While we aim for accuracy, there may be errors. The Motley Fool does not assume responsibility for your use of this information and encourages further research, including listening to the call directly and reviewing SEC filings. Please refer to our Terms and Conditions for more details.
The Motley Fool owns shares in and recommends Arista Networks. The Motley Fool has a disclosure policy.