AR Room Scanner & Furnishing — iOS-first (Unity 6.0 LTS)
Clean, modular Unity 6.0 LTS project focused on iOS (ARKit) for:
- Scanning a real room on-device (LiDAR Scene Reconstruction when available)
- Measuring in real-world meters
- Placing furniture with collisions and snapping
- Optional Object Detection (kept behind a compile flag)
- Optional RoomPlan integration path (native bridge) for semantic room models
This repo is designed to be stable today and easy to evolve (upgrade to 6.1/6.2 later).
Table of Contents
- Why iOS
- Folder Layout
- Assembly Definitions & Dependencies
- Packages & Versions
- Device Support Matrix
- Getting Started
- iOS Build & Xcode Setup
- Scenes & Flow
- Core Modules
- API Integration
- Configuration & Environments
- Version Control (Git, LFS, SmartMerge)
- Performance Guidelines (iOS/Metal)
- RoomPlan (future path)
- Troubleshooting
- Roadmap
- License
Why iOS
- Best on-device scanning via ARKit Scene Reconstruction on LiDAR devices (dense mesh, stable scale).
- Consistent tracking & depth across supported iPhones/iPads.
- Future option: RoomPlan (iOS 16+, LiDAR) to produce semantic, parametric rooms (walls/doors/windows) with accurate dimensions.
Folder Layout
```
Assets/
  _Project/
    App/                  # app flow & UI
      Controllers/
      Scenes/
        Bootstrap.unity
        ScanScene.unity
        FurnishScene.unity
      UI/
    ARRuntime/            # AR runtime features (platform-agnostic via AR Foundation)
      Scanning/           # mesh collection, colliders, export hooks
      Measurement/        # AB ruler, heights, helpers
      Placement/          # raycasts, snapping, overlap checks, physics
    Art/                  # in-Unity assets
      Logos/
      Materials/
      Models/
      Prefabs/
      Shaders/
      Textures/
    Domain/               # pure business/domain (no Unity deps)
      Models/
      Services/
    Infra/                # outside world (API, storage, settings)
      Api/                # IFurnitureApi + HttpFurnitureApi + DTOs
      Persistence/        # OBJ/GLB export, JSON metadata
      Settings/           # ScriptableObjects (ApiConfig, ProjectFeatures)
    Detectors/
      Null/               # default no-op detector
      Lightship/          # stub; compiled only with LIGHTSHIP_ENABLED
    Tests/                # EditMode/PlayMode tests
  Settings/               # URP & project assets (keep!)
  XR/                     # added by packages
```
Keep Settings/ (URP pipeline assets & editor links).
A separate top-level Scans/ folder (outside Assets/) is recommended for large exports to avoid re-import churn.
Assembly Definitions & Dependencies
Create one .asmdef per root module:
- `Domain`: no references
- `Infra`: references `Domain`
- `ARRuntime`: references `Domain`
- `App`: references `ARRuntime`, `Infra`, `Domain`
- `Detectors.Lightship`: references `Infra`, `Domain`; define constraint `LIGHTSHIP_ENABLED`
- `Tests`: references as needed
Dependency direction
App → (ARRuntime, Infra, Domain)
ARRuntime → Domain
Infra → Domain
Domain depends on nothing
This keeps compile times low and prevents “upward” coupling.
Packages & Versions
- Unity: 6.0 LTS (URP template)
- AR Foundation: 6.x
- ARKit XR Plugin: 6.x
- (Optional) XR Interaction Toolkit 3.x
- TextMesh Pro (built-in)
Pin package versions in `Packages/manifest.json` once the project compiles cleanly.
Device Support Matrix
| Capability | Requirement | Notes |
|---|---|---|
| ARKit basic tracking | iPhone 8+ / iOS 13+ (practical: iOS 15+) | Non-LiDAR devices won’t produce dense meshes. |
| Scene Reconstruction (meshing) | LiDAR devices (e.g., iPhone 12 Pro+, 13 Pro+, 14 Pro/Pro Max, 15 Pro/Pro Max; iPad Pro 2020+) | AR Foundation exposes meshes via ARMeshManager. |
| Environment Depth / People Occlusion | Device-dependent | Used for occlusion realism; works best on LiDAR. |
| RoomPlan (future) | iOS 16+, LiDAR | Generates semantic room model via native SDK. |
We gracefully fall back: if no LiDAR mesh is available, we still support plane detection + measurements.
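A hedged sketch of how that fallback could be wired with AR Foundation components; the `MeshingFallback` name and fields are illustrative, and it assumes `ARMeshManager` exposes a `subsystem` property that stays null when Scene Reconstruction is unsupported (verify against your AR Foundation version):

```csharp
// Illustrative sketch: if the meshing subsystem never starts (non-LiDAR device),
// disable the ARMeshManager and rely on plane detection + depth instead.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class MeshingFallback : MonoBehaviour
{
    [SerializeField] ARMeshManager meshManager;   // lives under the XR Origin
    [SerializeField] ARPlaneManager planeManager;

    void Start()
    {
        // Assumption: on devices without Scene Reconstruction, the mesh subsystem is null.
        if (meshManager != null && meshManager.subsystem == null)
        {
            meshManager.enabled = false;
            planeManager.enabled = true;
            Debug.Log("Scene Reconstruction unavailable; falling back to plane detection.");
        }
    }
}
```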
Getting Started
1. Switch to iOS
   - File → Build Settings → iOS → Switch Platform.
2. Install packages (Window → Package Manager)
   - AR Foundation 6.x
   - ARKit XR Plugin 6.x
   - (Optional) XR Interaction Toolkit 3.x
3. Project Settings (iOS)
   - Player → iOS
     - Scripting Backend: IL2CPP
     - Target Architectures: ARM64
     - Graphics APIs: Metal (only)
     - Minimum iOS Version: 15+ recommended (RoomPlan needs 16+)
     - Camera Usage Description (Info.plist): e.g., “AR scanning requires camera access.”
     - Photo Library Add Usage Description (if you export files to Photos)
     - Motion Usage Description (only if you use CoreMotion; otherwise omit)
   - XR Plug-in Management → iOS
     - Enable ARKit
     - In the ARKit settings, enable Environment Depth (and People Occlusion if needed)
   - URP
     - SRP Batcher: ON
     - MSAA: 2x
     - Mobile-friendly shadows
4. Scenes
   - Create `Bootstrap.unity`, `ScanScene.unity`, and `FurnishScene.unity` under `Assets/_Project/App/Scenes/`.
5. First run
   - Add `Bootstrap` and `ScanScene` to Build Settings (Bootstrap first).
   - Build to Xcode, set signing, and run on device.
iOS Build & Xcode Setup
- Build in Unity → generates an Xcode project.
- Xcode
- Select your Team & Provisioning profile
- Ensure Camera privacy string is present (Unity will add based on Player Settings)
- In “Signing & Capabilities”, you typically don’t need extra entitlements for ARKit beyond camera; add Files (iCloud) only if you export to Files app.
- Run on device (USB).
- If you see a black camera feed: check the privacy strings, ensure you are on a real device (not the Simulator), and confirm that `Requires ARKit` is set (Unity: Player → iOS → “Requires ARKit”).
Scenes & Flow
Scene Descriptions (TL;DR)
- Bootstrap.unity — Minimal entry scene. Initializes global settings (URP assets, `ProjectFeatures`, `ApiConfig`), handles one-time bootstrapping, and programmatically loads `ScanScene`. No AR logic or heavy UI here.
- ScanScene.unity — Scanning & measuring. Contains the AR Session, XR Origin, AR Plane/Raycast/Mesh Managers, and the camera with AR Camera Manager + AR Occlusion Manager. Merges AR mesh chunks into a single mesh (`RoomScanner`), enables A→B measurements in meters (`MeasureTool`), and can optionally export the room mesh.
- FurnishScene.unity — Furniture placement. Fetches items from the backend via `IFurnitureApi`, performs raycasts to floor/walls with snapping, runs overlap/collision checks before placement, uses occlusion for realism, and hosts the move/rotate/align UX.
Bootstrap.unity
Minimal loader that switches to the first “real” scene:
```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class Bootstrap : MonoBehaviour
{
    [SerializeField] string firstScene = "ScanScene";

    void Start() => SceneManager.LoadScene(firstScene, LoadSceneMode.Single);
}
```
ScanScene.unity (first milestone)
Hierarchy (suggested)
```
AR Session
XR Origin
 └─ Camera Offset
     └─ Main Camera (tag: MainCamera)
AR Managers (child of XR Origin)
 ├─ AR Plane Manager
 ├─ AR Raycast Manager
 └─ AR Mesh Manager        # must be under XR Origin in ARF 6
RoomMesh (MeshFilter + MeshRenderer + MeshCollider)
RoomScanner (script)       # combines AR chunks into RoomMesh
MeasureLine (LineRenderer)
MeasureTool (script)       # tap A→B distances in meters
```
On the Main Camera (child of XR Origin): add AR Camera Manager and AR Occlusion Manager.
FurnishScene.unity (next milestone)
- Placement raycasts and snapping to floor/walls
- Overlap checks / colliders to prevent interpenetration
- Occlusion enabled for realism
Core Modules
ARRuntime/Scanning — RoomScanner
- Polls `ARMeshManager.meshes` and combines all chunks into a single mesh every N seconds.
- Assigns that mesh to `RoomMesh`’s `MeshFilter` and `MeshCollider`.
- The combined mesh is in meters (Unity units = meters), so measuring is straightforward.
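A minimal sketch of that merge loop, assuming the `ARMeshManager.meshes` chunks and RoomMesh components described above; class and field names are illustrative, not the shipped `RoomScanner`:

```csharp
// Periodically combine ARMeshManager chunks into one mesh used for raycasting/measuring.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class RoomScannerSketch : MonoBehaviour
{
    [SerializeField] ARMeshManager meshManager;
    [SerializeField] MeshFilter roomMeshFilter;     // on the RoomMesh object
    [SerializeField] MeshCollider roomMeshCollider;
    [SerializeField] float mergeInterval = 2f;

    float _nextMerge;

    void Update()
    {
        if (Time.time < _nextMerge) return;
        _nextMerge = Time.time + mergeInterval;

        var chunks = meshManager.meshes; // IList<MeshFilter> of AR mesh chunks
        if (chunks == null || chunks.Count == 0) return;

        var combine = new List<CombineInstance>(chunks.Count);
        foreach (var chunk in chunks)
        {
            if (chunk.sharedMesh == null) continue;
            combine.Add(new CombineInstance
            {
                mesh = chunk.sharedMesh,
                transform = chunk.transform.localToWorldMatrix
            });
        }

        // NOTE: in production, reuse/destroy the previous merged Mesh to avoid leaks.
        var merged = new Mesh { indexFormat = UnityEngine.Rendering.IndexFormat.UInt32 };
        merged.CombineMeshes(combine.ToArray(), mergeSubMeshes: true, useMatrices: true);

        roomMeshFilter.sharedMesh = merged;
        roomMeshCollider.sharedMesh = merged; // only reassign when the geometry changed
    }
}
```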
ARRuntime/Measurement — MeasureTool
- Touch once = point A, touch again = point B.
- Draws a line (`LineRenderer`) and shows meters to 2 decimals.
- Uses `Physics.Raycast` against the combined collider or against detected planes.
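A hedged sketch of the two-tap flow using the legacy Input API; class and field names are assumptions, and the real `MeasureTool` should also drive an on-screen label:

```csharp
// Two taps raycast against the merged RoomMesh collider; the distance in meters
// is simply Vector3.Distance between the two hit points (Unity units = meters).
using UnityEngine;

public class MeasureToolSketch : MonoBehaviour
{
    [SerializeField] Camera arCamera;
    [SerializeField] LineRenderer line;

    Vector3? _pointA;

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began) return;

        var ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
        if (!Physics.Raycast(ray, out var hit, 20f)) return;

        if (_pointA == null)
        {
            _pointA = hit.point; // first tap = A
        }
        else
        {
            var a = _pointA.Value;
            var b = hit.point;   // second tap = B
            line.positionCount = 2;
            line.SetPositions(new[] { a, b });
            Debug.Log($"Distance: {Vector3.Distance(a, b):F2} m");
            _pointA = null;
        }
    }
}
```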
ARRuntime/Placement
- Raycast from screen to floor/wall planes or to the combined mesh.
- Snap by projecting onto a plane’s normal.
- Prevent collisions with `Physics.OverlapBox`/`OverlapSphere` before placing.
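For the overlap check, something along these lines works, assuming furniture prefabs carry a `BoxCollider` footprint; `PlacementChecks` is an illustrative helper, not an existing class:

```csharp
// Reject a placement if an OverlapBox at the candidate pose intersects existing colliders.
using UnityEngine;

public static class PlacementChecks
{
    public static bool CanPlace(BoxCollider footprint, Vector3 position, Quaternion rotation, LayerMask blockers)
    {
        var center = position + rotation * footprint.center;
        var halfExtents = Vector3.Scale(footprint.size * 0.5f, footprint.transform.lossyScale);
        // Any overlapping collider on the blocking layers means the spot is taken.
        return Physics.OverlapBox(center, halfExtents, rotation, blockers).Length == 0;
    }
}
```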
API Integration
Interface-first, so the app logic doesn’t depend on a concrete client:
```csharp
// Infra/Api/IFurnitureApi.cs
using System.Collections.Generic;
using System.Threading.Tasks;

public interface IFurnitureApi
{
    Task<IReadOnlyList<Furniture>> GetVariantsAsync(IEnumerable<string> ids);
    Task<IReadOnlyList<Furniture>> SearchAsync(string query, int page = 1, int pageSize = 20);
}
```

```csharp
// Infra/Api/HttpFurnitureApi.cs
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

public sealed class HttpFurnitureApi : IFurnitureApi
{
    private readonly HttpClient _http;
    private readonly ApiConfig _cfg;

    public HttpFurnitureApi(HttpClient http, ApiConfig cfg) { _http = http; _cfg = cfg; }

    public async Task<IReadOnlyList<Furniture>> GetVariantsAsync(IEnumerable<string> ids)
    {
        var url = $"{_cfg.BaseUrl}/api/v1/FurnitureVariant/GetByIds";
        using var resp = await _http.PostAsJsonAsync(url, ids);
        resp.EnsureSuccessStatusCode();
        return await resp.Content.ReadFromJsonAsync<List<Furniture>>() ?? new();
    }

    public async Task<IReadOnlyList<Furniture>> SearchAsync(string query, int page = 1, int pageSize = 20)
    {
        var url = $"{_cfg.BaseUrl}/api/v1/FurnitureVariant/Search?query={Uri.EscapeDataString(query)}&page={page}&pageSize={pageSize}";
        return await _http.GetFromJsonAsync<List<Furniture>>(url) ?? new();
    }
}
```
Config: Infra/Settings/ApiConfig (ScriptableObject) with BaseUrl, timeouts, and environment selectors (DEV/QA/PROD).
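One possible shape for that asset; the menu path, field names, and defaults beyond `BaseUrl`, timeout, and the DEV/QA/PROD selector are assumptions:

```csharp
// Infra/Settings/ApiConfig.cs (sketch)
using UnityEngine;

[CreateAssetMenu(menuName = "Project/Api Config", fileName = "ApiConfig")]
public class ApiConfig : ScriptableObject
{
    public enum Env { DEV, QA, PROD }

    [SerializeField] Env environment = Env.DEV;
    [SerializeField] string devBaseUrl = "https://dev.example.com";   // placeholder URLs
    [SerializeField] string qaBaseUrl = "https://qa.example.com";
    [SerializeField] string prodBaseUrl = "https://api.example.com";
    [SerializeField] int timeoutSeconds = 30;

    public int TimeoutSeconds => timeoutSeconds;

    public string BaseUrl => environment switch
    {
        Env.QA => qaBaseUrl,
        Env.PROD => prodBaseUrl,
        _ => devBaseUrl,
    };
}
```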
Configuration & Environments
Create a ProjectFeatures ScriptableObject (in Infra/Settings/) with toggles:
- `useMeshing` (on by default)
- `useOcclusion` (environment/people)
- `useObjectDetection` (off; Lightship behind `LIGHTSHIP_ENABLED`)
- `enableExports` (to write OBJ/GLB to app storage)
This lets QA test different combinations without code changes.
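A sketch of that asset using the toggle names listed above; the menu path and defaults are assumptions:

```csharp
// Infra/Settings/ProjectFeatures.cs (sketch)
using UnityEngine;

[CreateAssetMenu(menuName = "Project/Project Features", fileName = "ProjectFeatures")]
public class ProjectFeatures : ScriptableObject
{
    public bool useMeshing = true;     // LiDAR Scene Reconstruction
    public bool useOcclusion = true;   // environment/people occlusion
    public bool useObjectDetection;    // Lightship, behind LIGHTSHIP_ENABLED
    public bool enableExports;         // OBJ/GLB to app storage
}
```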
Version Control (Git, LFS, SmartMerge)
- Keep text/YAML in Git (scenes, prefabs, materials, `.meta`, `ProjectSettings/`, `Packages/`).
- Track large binaries (GLB/OBJ/FBX/PSDs/EXR/8K textures) with Git LFS.
- Enable Unity Smart Merge for clean scene/prefab merges.
(We also recommend a top-level Scans/ folder, LFS-tracked, for big room exports.)
Performance Guidelines (iOS/Metal)
- Metal only, IL2CPP, ARM64.
- URP: SRP Batcher ON, MSAA 2×, mobile shadows.
- Avoid per-frame allocations; reuse buffers.
- Combine mesh at intervals (1–2 s) rather than every frame.
- Update `MeshCollider.sharedMesh` only when the merged mesh changes, to avoid spikes.
- Consider decimation for very large meshes if the triangle count exceeds target thresholds.
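As an illustration of the last two points, a dirty-flag pattern along these lines keeps collider rebuilds off the hot path; the `RoomMeshUpdater` name and the change check are assumptions:

```csharp
// Only touch the MeshCollider when the merged geometry actually changed.
using UnityEngine;

public class RoomMeshUpdater : MonoBehaviour
{
    [SerializeField] MeshCollider roomCollider;

    int _lastVertexCount;

    public void Apply(Mesh merged)
    {
        // Cheap change check; a hash of bounds + vertex count also works.
        if (merged.vertexCount == _lastVertexCount) return;
        _lastVertexCount = merged.vertexCount;

        roomCollider.sharedMesh = null;    // force the collider to rebuild
        roomCollider.sharedMesh = merged;  // expensive: do this only when needed
    }
}
```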
RoomPlan (future path)
If you need clean, semantic floorplans and furniture categories:
- Implement a native iOS plugin (Swift/Obj-C) that runs RoomPlan (iOS 16+, LiDAR).
- Export USDZ/USDA/OBJ/glTF or RoomPlan JSON; import into Unity via `Infra/Persistence`.
- Provide an adapter `IRoomImporter` so `App` can switch between the Scene Reconstruction mesh and the RoomPlan semantic model at runtime/build time.
Keep all RoomPlan code behind a ROOMPLAN_ENABLED define if you prefer the same pattern as Lightship.
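A possible shape for that `IRoomImporter` adapter, sketched under the assumption that both import paths hand `App` a Unity-space room root; the real contract will depend on how much of the RoomPlan semantics you surface:

```csharp
// Hypothetical adapter interface; implementation names below are illustrative.
using System.Threading.Tasks;
using UnityEngine;

public interface IRoomImporter
{
    // Returns a Unity-space room root (mesh or semantic hierarchy) ready for placement.
    Task<GameObject> ImportAsync(string sourcePath);
}

// Two implementations could then live side by side:
// - SceneReconstructionImporter: wraps the merged ARMeshManager mesh.
// - RoomPlanImporter: parses the exported USDZ/JSON, compiled behind ROOMPLAN_ENABLED.
```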
Troubleshooting
- Popup: “An ARMeshManager must be a child of an XROrigin.” Move `ARMeshManager` under the XR Origin (not onto the same GameObject).
- Black camera feed: real device only (no Simulator), Camera usage string present, `Requires ARKit` ticked, provisioning OK.
- No mesh appears: the device may not have LiDAR; fall back to planes & depth. Ensure ARKit Scene Reconstruction is supported on the test device.
- Raycast doesn’t hit: ensure `RoomMesh` has a MeshCollider and that you merged at least once. Check layers.
- Build fails in Xcode: clear Derived Data, check signing, and ensure Metal is the only graphics API.
Roadmap
- ✅ iOS-first scanning (Scene Reconstruction), measuring, placement skeleton
- ⏳ Exporters (OBJ/GLB + Draco), thumbnails, metadata (`units`, `bbox`, triangle count)
- ⏳ Furniture placement UX (snapping gizmos, grid, rotation/align to wall)
- ⏳ Semantic planes (wall/floor/ceiling) classification helpers
- ⏳ RoomPlan native bridge (optional feature flag)
- ⏳ Lightship detection re-enable (compile flag)
License
TBD — choose a license that suits your distribution model (MIT/Apache-2.0/Proprietary).
Quick Start (TL;DR)
- Open in Unity 6.0 LTS, switch to iOS.
- Install AR Foundation 6.x + ARKit XR Plugin 6.x.
- Player: IL2CPP / ARM64 / Metal, iOS 15+, Camera privacy string.
- XR Plug-in Management: ARKit ON, enable Environment Depth.
- Open ScanScene → Run on a LiDAR device → tap two points to measure in meters.
- Move to FurnishScene for placement once scanning feels good.