How to size CyberArk PSM storage using a simple, reliable formula

Understand how to size a PSM server's storage using the formula: (Maximum concurrent sessions) × (Average length of recorded session) × (Average bit rate of recorded video) + 20 GB. This helps you plan for concurrency, session duration, and video quality while accounting for practical overhead and future growth.

Sizing PSM storage without the guesswork

If you’re dealing with CyberArk’s Privileged Session Manager (PSM) in a real environment, storage is one of those things you can’t ignore. Too little, and you’ll trip over performance gaps or fail to retain recordings; too much, and you’re feeding a future cost problem. Here’s a straightforward way to think about how much storage a PSM server really needs—the formula that ties together how many sessions can run at once, how long each session lasts, and how data-heavy the recordings are.

The three things that matter most

Let me explain what goes into the storage calculation. There are three core factors, and each one maps to a practical, observable reality in your environment:

  • Maximum number of concurrent sessions

This is the ceiling of how many user sessions might be captured at the same time. If you’ve got a busy admin floor or a high-security environment with multiple teams, you’ll see more simultaneous sessions. More concurrent sessions mean more data being produced at any given moment.

  • Average length of the recorded session

Some sessions are quick—a few minutes to verify something—while others stretch out for an hour or more. The longer the sessions, the more data gets generated per session.

  • Average bit rate of the recorded video

The quality of the recording (and the compression you choose) determines how many bits per second are written to disk. A higher bit rate gives you crisper detail but dramatically increases storage needs. Lower bit rates save space but can compromise visibility.

And there’s a baseline: 20 GB. That baseline isn’t arbitrary. It accounts for the operating system footprint, metadata, logs, and other overhead that isn’t captured by the raw session data itself.

The formula in plain terms

The required storage, in simplified form, is:

(Maximum number of concurrent sessions × Average length of recorded session × Average bit rate of recorded video) + 20 GB

That’s it. Each piece plugs into the next to give you a concrete number you can allocate in your storage plan. It’s not magic; it’s a practical way to translate a busy environment into a trusted capacity figure.
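To make the arithmetic easy to repeat, here is a minimal sketch of the formula as a Python function. The function and parameter names are illustrative rather than part of any CyberArk tooling, and it assumes session length in seconds, bit rate in megabits per second, and decimal gigabytes (1 GB = 10^9 bytes):

    def psm_storage_gb(max_concurrent_sessions: int,
                       avg_session_seconds: float,
                       avg_bitrate_mbps: float,
                       baseline_gb: float = 20.0) -> float:
        """Estimate PSM recording storage in decimal gigabytes."""
        # Total bits written while the maximum number of sessions record at once.
        total_bits = max_concurrent_sessions * avg_session_seconds * avg_bitrate_mbps * 1_000_000
        # 8 bits per byte, 10^9 bytes per GB, plus the fixed baseline.
        return total_bits / 8 / 1_000_000_000 + baseline_gb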

A quick worked example (with numbers you can relate to)

Let’s run a hypothetical to make this concrete. Suppose your environment can have up to 8 concurrent sessions. Each session, on average, runs for 30 minutes (which is 1800 seconds). The recorded video runs at an average bit rate of 4 Mbps (which is 4,000,000 bits per second).

  • Multiply the three factors: 8 × 1800 × 4,000,000 = 57,600,000,000 bits

  • Convert that to gigabytes: 57.6 billion bits ÷ 8,000,000,000 bits per GB ≈ 7.2 GB

  • Add the baseline: 7.2 GB + 20 GB ≈ 27.2 GB

So, with these assumptions, you’d plan for roughly 27 GB of storage for the PSM server’s video recordings. Of course, this is a starting point. Real-world numbers shift as you tighten the bit rate, shorten or extend sessions, or handle a different mix of workloads. The beauty of the formula is that you can adjust one factor at a time and see how your capacity changes.
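If you want to check the numbers yourself, the same steps can be spelled out in a few lines of Python (the unit conversions are the standard decimal ones, nothing CyberArk-specific):

    total_bits = 8 * 1800 * 4_000_000      # 57,600,000,000 bits of recorded video
    recording_gb = total_bits / 8 / 1e9    # ≈ 7.2 GB
    required_gb = recording_gb + 20        # ≈ 27.2 GB including the baseline
    print(f"{required_gb:.1f} GB")         # prints 27.2 GB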

Putting the formula to work in your environment

If you’re sizing a PSM deployment (or revisiting an existing one), here’s a practical way to apply the calculation without drowning in data sheets:

  • Gather realistic numbers

  • Peak concurrent sessions: Look at your historical access patterns, incident response activity, or audit windows. If you don’t have a solid history, use intelligent estimates with a plan to monitor and adjust (the sketch after this list shows one way to pull peak concurrency and average session length from exported session data).

  • Average session length: Review past recordings or capture logs to estimate how long sessions typically last. Don’t forget to account for longer sessions during investigations or audits.

  • Average bit rate: Check your current recording settings. If you’ve configured 4 Mbps for high-clarity capture, that’s your starting point. If you’ve been experimenting with compression or resolution, factor in the changed rate.

  • Do a sanity check with a small test

Run a controlled, representative sample of sessions and record them at the chosen quality. Use the resulting data to verify whether the math aligns with actual storage consumption over a day or a week.

  • Build in headroom

The 20 GB baseline handles the things that aren’t captured in the three factors. Depending on your risk tolerance and growth expectations, you might add extra headroom for maintenance, metadata growth, or unusual spikes.

  • Monitor and tune

After deployment, track actual storage usage and compare it to your calculated figure. If actual consumption consistently exceeds the estimate, either add capacity or rein in the inputs: reduce the bit rate, shorten session lengths where possible, or cap concurrent sessions during peak times. If you’re consistently well under the estimate, you know where to trim.
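On the data-gathering step: if you can export session metadata (start and end timestamps) from your reporting tool or SIEM, a short script can derive the first two inputs for you. The record format below is a made-up example, and peak concurrency is computed with a simple sweep over start and end events:

    from datetime import datetime

    # Hypothetical export: (start, end) timestamps for each recorded session.
    sessions = [
        (datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 9, 40)),
        (datetime(2024, 5, 1, 9, 10), datetime(2024, 5, 1, 9, 35)),
        (datetime(2024, 5, 1, 10, 0), datetime(2024, 5, 1, 11, 0)),
    ]

    # Average session length in seconds.
    avg_seconds = sum((end - start).total_seconds() for start, end in sessions) / len(sessions)

    # Peak concurrency: walk the start (+1) and end (-1) events in time order.
    events = sorted([(s, 1) for s, _ in sessions] + [(e, -1) for _, e in sessions])
    peak = current = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)

    print(f"Average session length: {avg_seconds:.0f} s, peak concurrency: {peak}")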

What affects the numbers (beyond the formula)

A few real-world knobs can shift your storage needs, sometimes more than you’d expect:

  • Video quality vs. retention needs

Higher fidelity makes for clearer forensics, but it taxes storage. If your policy requires retention of every nuance, keep the higher bit rate; if you can compromise on some frames or reduce color depth, you’ll save space (the sketch after this list shows how a few candidate bit rates shift the estimate).

  • Compression and codecs

The codec you choose matters. Modern codecs can deliver similar visual quality at lower bit rates. It’s worth testing a few options to find a sweet spot for your environment.

  • Session behavior during audits or investigations

If your security program has periods of elevated activity, you’ll see more concurrent sessions or longer durations. Plan for those windows so you aren’t caught short.

  • Overhead and metadata

The baseline 20 GB is more than a placeholder. OS overhead, log files, indexing, and metadata can creep up, especially in environments with vigorous auditing or complex policy frameworks.

  • Storage type and performance tier

Fast NVMe or SSD-backed storage can handle high read/write bursts more gracefully, but at a higher cost. If your PSM workload is bursty, a tiered approach with warm storage for older recordings can be a smart compromise.

  • Backups and disaster recovery

If you replicate or back up PSM recordings to another site or cloud storage, you’ll want to factor in the additional capacity and bandwidth those processes require. The formula gives you the primary storage need, but DR planning adds another layer.
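Because bit rate and disaster-recovery copies tend to move the total more than anything else, it is worth tabulating a few scenarios before committing. A small sketch, reusing the worked example’s 8 sessions of 30 minutes and adding a hypothetical single off-site copy of the recordings:

    def recording_gb(sessions, seconds, mbps):
        # Video data only, in decimal GB.
        return sessions * seconds * mbps * 1_000_000 / 8 / 1e9

    BASELINE_GB = 20.0
    DR_COPIES = 1  # hypothetical: one full off-site copy of the recordings

    for mbps in (2, 4, 8):  # candidate average bit rates in Mbps
        video = recording_gb(8, 1800, mbps)
        primary = video + BASELINE_GB
        dr_extra = video * DR_COPIES
        print(f"{mbps} Mbps: primary ≈ {primary:.1f} GB, DR replica adds ≈ {dr_extra:.1f} GB")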

Common-sense tips to keep you on track

  • Start with a believable baseline

If you’re new to sizing, pick conservative numbers and test. It’s easier to scale up later than to scramble for capacity once recordings have overrun what you provisioned.

  • Use a living sizing model

Treat the formula as a dynamic tool. As your environment evolves—more users, different recording standards, new retention policies—recalculate and adjust.

  • Document assumptions

Keep a short note of the numbers you used (concurrent sessions, session length, bit rate) and why you chose them; a lightweight example follows this list. It saves confusion later when audits come around or when team members switch.

  • Align with governance and security needs

Your storage plan should reflect policy requirements for retention, auditing, and access controls. The more your plan aligns with governance, the smoother the operational side will run.
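There is no required format for documenting the assumptions; even a small, version-controlled snippet kept next to your sizing script does the job. A hypothetical example with illustrative values:

    # Sizing assumptions for the PSM server (illustrative values).
    assumptions = {
        "max_concurrent_sessions": 8,   # peak seen in recent audit windows
        "avg_session_seconds": 1800,    # median of exported session durations
        "avg_bitrate_mbps": 4,          # current recording quality setting
        "baseline_gb": 20,              # OS, logs, and metadata overhead
    }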

A note on realism and caveats

This formula is a practical guide, not a crystal ball. It’s built to translate real-world activity into a capacity estimate, but every environment carries its own quirks. Real-world testing, monitoring, and iterative tuning are your friends. And if you’ve got unusual workloads—like long investigative sessions, frequent screen captures during red-team exercises, or unusually high-resolution capture for forensics—that will tilt the numbers and you’ll want to adjust accordingly.

Bringing it all together

PSM storage sizing doesn’t have to be a mystery box. By focusing on three concrete inputs—maximum concurrent sessions, average session length, and average bit rate—you get a transparent, adaptable way to forecast storage needs. The +20 GB baseline is the sensible garnish that covers the rest: a little cushion for the operating system, metadata, and the occasional maintenance task.

If you’re planning or re-evaluating a PSM deployment, start with the formula, plug in your numbers, and run a small-scale test to validate. You’ll walk away with a clear, defendable capacity estimate and a practical roadmap for monitoring and adjustment as your environment evolves.

A final thought to keep the pace steady

Storage planning in security tooling is as much about disciplined measurement as it is about math. The numbers aren’t just digits on a spreadsheet—they translate into reliable access, faster investigations, and peace of mind for your team. And when you combine thoughtful sizing with ongoing observation, you’re not just provisioning space—you’re shaping a calmer, more controlled security posture for your organization.
