Actionable coding and design rules for building production-grade AR/VR applications with Unity/AR Foundation (C#) and companion XR stacks.
Building production-ready AR/VR apps means navigating a minefield of platform quirks, performance gotchas, and user comfort requirements that can derail even experienced developers. You're juggling frame rates, fighting memory allocations, and trying to ship something that actually works across different devices—all while keeping users comfortable and engaged.
- **Performance Hell:** Your Unity build runs great in the editor but drops frames on real devices. You're hitting GC spikes during interactions, your raycast calls are tanking the CPU, and you can't figure out why your Quest app feels sluggish.
- **Platform Fragmentation:** Code that works perfectly on ARKit fails mysteriously on ARCore. Your HoloLens features break when you port to Quest. Each platform has its own quirks, and you're maintaining separate codebases.
- **User Comfort Disasters:** Users are getting motion sick, your UI is unreachable, and tracking failures leave people confused. You're shipping apps that technically work but create terrible user experiences.
- **Development Chaos:** No clear patterns for XR-specific challenges like anchor management, spatial state persistence, or graceful degradation when tracking fails. Your codebase becomes an unmaintainable mess of platform-specific workarounds.
These rules transform your XR development from chaotic experimentation into systematic craftsmanship. You get battle-tested patterns for Unity + AR Foundation, performance optimization techniques that actually work in production, and user comfort guidelines that prevent the common mistakes that kill XR apps.
Core Philosophy: Comfort first, performance-driven, fail gracefully. Every rule is designed around shipping production apps that users actually want to use.
Anti-pattern — raycasting and allocating every frame:

```csharp
void Update()
{
    // Allocating a new list and raycasting every frame destroys performance
    var hits = new List<ARRaycastHit>();
    arRaycastManager.Raycast(Input.mousePosition, hits, TrackableType.PlaneWithinPolygon);
    if (hits.Count > 0)
    {
        // Fires every single frame the ray connects
        PlaceObject(hits[0].pose);
    }
}
```

Result: Frame drops, battery drain, terrible user experience.
Better — event-driven raycasting with a reused buffer (`touchInputManager` is your input wrapper):

```csharp
// Reused buffer — no per-frame allocation
private readonly List<ARRaycastHit> raycastHits = new List<ARRaycastHit>();

private void OnEnable()
{
    touchInputManager.onTap += HandleTapInput;
}

private void OnDisable()
{
    touchInputManager.onTap -= HandleTapInput;
}

private void HandleTapInput(Vector2 screenPosition)
{
    // Raycast only on discrete tap events, against planes only
    if (arRaycastManager.Raycast(screenPosition, raycastHits, TrackableType.PlaneWithinPolygon))
    {
        PlaceObject(raycastHits[0].pose);
    }
}
```

Result: Smooth 60+ FPS, responsive interactions, better battery life.
Anti-pattern — placing content without an anchor:

```csharp
void PlaceObject(Pose pose)
{
    var obj = Instantiate(prefab);
    obj.transform.SetPositionAndRotation(pose.position, pose.rotation);
    // No anchor — the object drifts and is lost on tracking failure!
}
```
Better — attach to an anchor and persist it (`SaveAnchorPersistently` is your own persistence layer, e.g. cloud anchors):

```csharp
private async void PlaceObject(Pose pose)
{
    var anchor = anchorManager.AddAnchor(pose);
    if (anchor != null)
    {
        // Parenting to the anchor keeps the object stable across relocalisation
        var obj = Instantiate(prefab, anchor.transform);
        await SaveAnchorPersistently(anchor.trackableId);
    }
}
```

Result: Objects persist across sessions, survive tracking loss, enable cloud sharing.
Anti-pattern — assuming AR support:

```csharp
void Start()
{
    // Assumes AR is supported
    arSession.enabled = true;
    depthManager.enabled = true; // Crashes on devices without depth support
}
```
Better — check availability and feature descriptors first (`uiManager` is your UI layer):

```csharp
private IEnumerator Start()
{
    // CheckAvailability() returns a yieldable promise — wait for it in a coroutine
    yield return ARSession.CheckAvailability();

    if (ARSession.state == ARSessionState.Unsupported)
    {
        uiManager.ShowFallbackMode();
    }
    else
    {
        InitializeAR();
    }
}

private void InitializeAR()
{
    // Enable occlusion only where the device advertises environment depth
    if (occlusionManager.descriptor?.supportsEnvironmentDepth == true)
    {
        occlusionManager.enabled = true;
    }
}
```

Result: No crashes, clear user communication, professional polish.
Create the standardized folder hierarchy:

```
Assets/
  XR/
    Prefabs/    # Reusable XR components
    Scripts/    # Core XR logic
    Materials/  # XR-specific shaders and materials
    Tests/      # Unit and integration tests
```
Set up namespace conventions:

```csharp
namespace YourCompany.XR.Locomotion
{
    public class TeleportController : MonoBehaviour { }
}
```
```csharp
[CreateAssetMenu(fileName = "XRConfig", menuName = "XR/Configuration")]
public class XRConfiguration : ScriptableObject
{
    [Header("Comfort")]
    public float maxAcceleration = 1.2f;   // m/s²
    public float maxRotationSpeed = 30f;   // °/s

    [Header("Locomotion")]
    public bool enableTeleportation = true;
    public bool enableSnapTurning = true;
}
```
```csharp
public class XRManager : MonoBehaviour
{
    [SerializeField] private XRConfiguration config;

    private void Awake()
    {
        // Project-specific hooks: verify platform support, then apply comfort settings from config
        ValidatePlatformSupport();
        ConfigureComfortSettings();
    }
}
```
```csharp
public class OptimizedARController : MonoBehaviour
{
    private Camera arCamera;
    private ARRaycastManager raycastManager;
    private readonly List<ARRaycastHit> hits = new List<ARRaycastHit>(10);

    private void Awake()
    {
        // Cache all references once — Camera.main and GetComponent are too slow for per-frame use
        arCamera = Camera.main;
        raycastManager = GetComponent<ARRaycastManager>();
    }

    // No Update() — react to events instead
    private void OnEnable()
    {
        ARSession.stateChanged += OnARSessionStateChanged;
    }

    private void OnDisable()
    {
        // Always unsubscribe to avoid callbacks on destroyed objects
        ARSession.stateChanged -= OnARSessionStateChanged;
    }

    private void OnARSessionStateChanged(ARSessionStateChangedEventArgs args) { /* ... */ }
}
```
```csharp
[Test]
public void TestAnchorPersistence()
{
    var mockPose = new Pose(Vector3.zero, Quaternion.identity);
    var anchor = anchorManager.AddAnchor(mockPose);

    Assert.IsNotNull(anchor);
    Assert.AreEqual(mockPose.position, anchor.transform.position);
}
```
Set up a device-testing checklist covering every target device class (see the Testing section below).
These rules eliminate the guesswork from XR development. You get proven patterns that work in production, performance optimizations that actually move the needle, and user experience guidelines that keep people engaged instead of nauseated.
Stop fighting the platform—start shipping XR apps that users love.
You are an expert in: Unity (C#), AR Foundation, ARKit, ARCore, Mixed Reality Toolkit (MRTK), WebXR (8th Wall, Three.js), Unreal Engine (Blueprint/C++), Spatial Audio, Haptic SDKs.
## Key Principles
- Design for natural, intuitive interactions that map 1-to-1 with real-world gestures.
- Comfort first: keep acceleration < 1.2 m/s² and rotation < 30°/s; offer teleport/blink locomotion.
- Iterate with user-centered play-testing every sprint; deploy over-the-air (OTA) builds.
- Latency budget: end-to-end motion-to-photon ≤ 20 ms. Prioritise 90 FPS (6-DoF HMDs) or 60 FPS (mobile AR).
- Use spatial audio and haptic cues to reinforce visual feedback.
- Fail gracefully on unsupported hardware; surface clear onboarding and contextual prompts.
- Default to inclusive design: support seated, standing, colour-blind, single-handed play.
- Single source of truth: store session state in ScriptableObjects or dedicated XR managers.
## C# / Unity Language-Specific Rules
- File naming: `PascalCase.cs` matching the class. One public class per file.
- Namespace hierarchy: `Company.Product.Subsystem` (e.g., `Acme.XR.Locomotion`).
- MonoBehaviours:
• Avoid `Update()` when possible—prefer events (`OnEnable`, `ARSession.stateChanged`).
• If `Update()` is required, cache component/transform references in `Awake()`.
- Use `[SerializeField] private` fields + public read-only properties; expose only what the Inspector needs.
- ScriptableObjects for configuration (e.g., locomotion speeds, haptic patterns) to enable designer tweaking.
- Reduce GC allocations:
• Avoid LINQ in per-frame code.
• Reuse `List<T>` and `NativeArray` with `Allocator.Persistent`.
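To make the allocation rules concrete, here is a minimal sketch (class and field names are illustrative) of a component that pre-allocates its buffers once and releases the native array explicitly on teardown:

```csharp
using System.Collections.Generic;
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AllocationFreeSampler : MonoBehaviour
{
    // Reused across frames — never allocate these inside Update()
    private readonly List<ARRaycastHit> raycastHits = new List<ARRaycastHit>(16);
    private NativeArray<Vector3> pointBuffer;

    private void Awake()
    {
        // Persistent allocator: lives until we dispose it explicitly
        pointBuffer = new NativeArray<Vector3>(1024, Allocator.Persistent);
    }

    private void OnDestroy()
    {
        // Persistent allocations are not garbage collected — dispose manually
        if (pointBuffer.IsCreated)
            pointBuffer.Dispose();
    }
}
```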
- `async/await` allowed for I/O, NEVER for per-frame logic; always catch `TaskCanceledException`.
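A hedged sketch of that rule — I/O awaited off the frame loop, cancelled when the component is destroyed, with the cancellation exception handled deliberately (`SendToBackend` is a placeholder for your actual network call):

```csharp
using System.Threading;
using System.Threading.Tasks;
using UnityEngine;

public class AnchorUploader : MonoBehaviour
{
    private readonly CancellationTokenSource cts = new CancellationTokenSource();

    public async void UploadAsync(byte[] payload)
    {
        try
        {
            // I/O is fine behind await; per-frame logic is not
            await Task.Run(() => SendToBackend(payload), cts.Token);
        }
        catch (TaskCanceledException)
        {
            // Expected during teardown — never let this surface to the user
        }
    }

    private void OnDestroy() => cts.Cancel();

    private void SendToBackend(byte[] payload)
    {
        // Placeholder for the real upload
    }
}
```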
- Use `using` blocks for temporary native resources (`CpuImage`) to auto-dispose.
- Prefabs live under `Assets/XR/Prefabs/`; runtime-instantiated content goes under an `XRContent` root.
## Error Handling & Validation
- Validate platform on start:
```csharp
// CheckAvailability() is asynchronous — yield on it, then inspect ARSession.state
yield return ARSession.CheckAvailability();
if (ARSession.state == ARSessionState.Unsupported)
{
    UI.ShowUnsupportedDevice();
    yield break;
}
```
- Use early returns to avoid nested blocks.
- For AR Foundation getters follow Try* pattern:
```csharp
if (!cameraManager.TryAcquireLatestCpuImage(out var image))
    return; // Graceful degradation

using (image) // XRCpuImage wraps native memory — always dispose
{
    // ... process the CPU image ...
}
```
- Wrap native plugin calls in `try/catch` and route to a central `XRErrorHandler` that logs, notifies UI, and reports analytics.
- Never throw raw exceptions across frame boundaries; convert to user-friendly messages.
- Provide fallback modes (e.g., non-AR 3-D viewer) when tracking lost > 3 sec.
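One way to implement that 3-second fallback is a lightweight timer over `ARSession.state` (the event names are illustrative; your UI layer subscribes to them to swap in the non-AR viewer):

```csharp
using System;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TrackingFallback : MonoBehaviour
{
    [SerializeField] private float fallbackDelaySeconds = 3f;

    // Hypothetical hooks: the UI layer listens and swaps between AR view and 3-D viewer
    public event Action<NotTrackingReason> FallbackEntered;
    public event Action FallbackExited;

    private float lostTimer;
    private bool fallbackActive;

    private void Update()
    {
        // Allocation-free per-frame check — acceptable despite the "avoid Update" rule
        bool tracking = ARSession.state == ARSessionState.SessionTracking;
        lostTimer = tracking ? 0f : lostTimer + Time.deltaTime;

        if (!fallbackActive && lostTimer > fallbackDelaySeconds)
        {
            fallbackActive = true;
            FallbackEntered?.Invoke(ARSession.notTrackingReason);
        }
        else if (fallbackActive && tracking)
        {
            fallbackActive = false;
            FallbackExited?.Invoke();
        }
    }
}
```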
## Framework-Specific Rules
### Unity + AR Foundation
- Scene bootstrap: `ARSession` + `XROrigin` (named `ARSessionOrigin` before AR Foundation 5) + `ARInputManager` are mandatory.
- Plane/point-cloud visualisers must be toggleable (`Developer Mode`).
- Use `ARRaycastManager.Raycast(ScreenPoint, results, TrackableType.PlaneWithinPolygon)` only on discrete touch events.
- Anchor workflow:
• After placement, attach prefab to `ARAnchor` → persists across relocalisation.
• Persist anchors with `ARAnchorManager.AddAnchor` & `Guid` for cloud sharing.
- Occlusion:
• Enable `AROcclusionManager` only on devices that support it; query `descriptor.supportsEnvironmentDepth`.
- Asset delivery: Use Addressables; mark large models as `AssetBundle` with LZ4 for OTA patches.
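A minimal Addressables load/release sketch (the `"LargeModel"` address is a placeholder you would configure in your Addressables groups):

```csharp
using UnityEngine;
using UnityEngine.AddressableAssets;
using UnityEngine.ResourceManagement.AsyncOperations;

public class ModelLoader : MonoBehaviour
{
    private AsyncOperationHandle<GameObject> handle;

    public async void LoadModel(Transform xrContentRoot)
    {
        // Loads on demand from a local or remote bundle
        handle = Addressables.LoadAssetAsync<GameObject>("LargeModel");
        var prefab = await handle.Task;
        Instantiate(prefab, xrContentRoot);
    }

    private void OnDestroy()
    {
        // Release the handle so the bundle can be unloaded
        if (handle.IsValid())
            Addressables.Release(handle);
    }
}
```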
- Input:
• XR Interaction Toolkit for controllers + hand tracking.
• Map common actions (`select`, `activate`, `uiPress`) via the new Input System.
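Wiring a `select` action through the Input System might look like this sketch (the `InputActionReference` is assigned in the Inspector, e.g. from the XRI default action asset):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class SelectHandler : MonoBehaviour
{
    // Assigned in the Inspector, e.g. the XRI "Select" action
    [SerializeField] private InputActionReference selectAction;

    private void OnEnable()
    {
        selectAction.action.performed += OnSelect;
        selectAction.action.Enable();
    }

    private void OnDisable()
    {
        // Unsubscribe to avoid callbacks on a disabled object
        selectAction.action.performed -= OnSelect;
    }

    private void OnSelect(InputAction.CallbackContext ctx)
    {
        // React to the select press here
        Debug.Log("Select performed");
    }
}
```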
### MRTK (HoloLens / Windows MR)
- Never modify default MRTK profiles; clone and extend.
- Use `Solver` components (e.g., `RadialView`, `Orbital`) rather than custom smoothing maths.
- Spatial mapping must run in **Observer Volume** with “Occlusion Only” mesh type for performance.
### WebXR / 8th Wall
- Target ≤ 50 k polys per mesh, ≤ 5 MB GLB per scene.
- Use `XRHitTest` once per tap, not per frame.
- Provide fallback CSS 2-D UI for non-XR browsers.
## Testing
- Unit tests with `Unity Test Runner` (EditMode) for math and data.
- PlayMode tests with `TestFixtures` + Mock HMD/hand poses.
- CI: run `-batchmode -nographics -runTests` in GitHub Actions, export `.xml` reports.
- Manual checklists on physical devices: iOS (LiDAR and non-LiDAR), flagship and mid-range Android, Quest 2, HoloLens 2.
## Performance Optimisation
- Profile first: `Window > Analysis > Profiler` & GPU profiler (Quest: OVR Metrics).
- CPU: keep total main thread ≤ 10 ms; move heavy calc to `JobSystem + Burst`.
- GPU: use SRP Batcher, GPU Instancing, limit real-time lights to ≤ 2.
- Adaptive quality: scale render target (Foveated Rendering on Quest).
- Asset guidelines: textures max 2048², compress ASTC on mobile.
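A simple adaptive-quality sketch using `XRSettings.renderViewportScale`, which can be changed per frame without reallocating eye textures (the thresholds are illustrative; on Quest you would pair this with the foveated-rendering API):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class AdaptiveResolution : MonoBehaviour
{
    [SerializeField] private float lowScale = 0.7f;
    [SerializeField] private float highScale = 1.0f;
    [SerializeField] private float frameBudgetMs = 11.1f; // 90 FPS target

    private void Update()
    {
        // Smoothed frame time in milliseconds
        float frameMs = Time.smoothDeltaTime * 1000f;

        // Drop the viewport scale when over budget, restore when comfortably under
        if (frameMs > frameBudgetMs * 1.1f)
            XRSettings.renderViewportScale = lowScale;
        else if (frameMs < frameBudgetMs * 0.9f)
            XRSettings.renderViewportScale = highScale;
    }
}
```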
## Security & Privacy
- Request only `CAMERA`, `MOTION` permissions; no background recording.
- Immediately discard camera frames; use GPU path where possible to avoid RAM copies.
- Encrypt and anonymise any spatial mesh or cloud anchor data before upload.
- Comply with platform privacy manifests (iOS 17 Privacy Nutrition, Android 14 Data Safety).
## Accessibility
- Offer colour-blind safe palette (WCAG 2.1 AA).
- Provide speech or caption alternatives to spatial audio cues.
- Allow one-handed mode: reachable UI (< 60 % vertical height).
- Magnification gesture (pinch to zoom world UI) and configurable text size.
## Documentation & Folder Structure
```
Assets/
XR/
Prefabs/
Scripts/
Materials/
Tests/
Packages/
ProjectSettings/
```
- Each public script has XML summary + usage example.
- README per feature folder with setup steps, known issues.
## Continuous Deployment
- iOS/Android: Fastlane lanes for signing + TestFlight / Internal Test.
- Stand-alone (Quest): `ovr-platform-util upload` in pipeline.
- Auto-increment bundle version via commit hash.
## Analytics & Telemetry
- Track: session length, comfort events (motion-sick aborts), feature usage, FPS buckets.
- Anonymise user IDs; batch-upload on Wi-Fi only.
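FPS bucketing can be as simple as this sketch (bucket edges and the reporting hook are placeholders for your analytics SDK):

```csharp
using UnityEngine;

public class FpsBucketTracker : MonoBehaviour
{
    // Seconds spent in each FPS band: <30, 30–60, 60–90, 90+
    private readonly float[] buckets = new float[4];

    private void Update()
    {
        float fps = 1f / Mathf.Max(Time.unscaledDeltaTime, 0.0001f);
        int index = fps < 30f ? 0 : fps < 60f ? 1 : fps < 90f ? 2 : 3;
        buckets[index] += Time.unscaledDeltaTime;
    }

    private void OnApplicationPause(bool paused)
    {
        if (paused)
            ReportBuckets();
    }

    private void ReportBuckets()
    {
        // Placeholder: hand the buckets to your analytics SDK, batch-upload on Wi-Fi only
    }
}
```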
> Follow this rule set to produce maintainable, high-performance and user-friendly AR/VR experiences ready for production release.