# Cloud Device Farms

## Why Cloud Device Farms Are Essential
Real-device testing at scale requires cloud infrastructure. No team can realistically maintain an in-house lab of 200+ devices spanning 15+ OS versions while keeping them all charged, updated, and connected. Cloud device farms provide on-demand access to thousands of real devices, eliminating the physical device management burden.
## Platform Comparison
| Feature | BrowserStack | Sauce Labs | AWS Device Farm | Firebase Test Lab |
|---|---|---|---|---|
| Real devices | 3,000+ | 2,000+ | Curated set | Google devices + popular OEMs |
| Emulators/simulators | Yes | Yes | No (real only) | Yes |
| Parallel execution | Unlimited | Plan-dependent | 5-50 concurrent | Plan-dependent |
| Appium support | Full | Full | Full | No (instrumentation and Robo only) |
| Playwright/Selenium | Full | Full | Not for mobile web | No |
| Video recording | Yes | Yes | Yes | Yes |
| Network throttling | Yes | Yes | Limited | Limited |
| Geolocation | Yes | Yes | No | No |
| CI integration | All major | All major | AWS CodePipeline | Firebase CLI |
| Pricing model | Per-user/month | Per-user/month | Per-device-minute | Free tier + per-use |
| Best for | Broad coverage, web + mobile | Enterprise, compliance | AWS-native teams | Android-first teams |
## Choosing a Platform
| If Your Team... | Choose | Why |
|---|---|---|
| Needs both web and mobile testing | BrowserStack or Sauce Labs | Unified platform for both |
| Is AWS-native and budget-conscious | AWS Device Farm | Pay-per-minute, no subscription |
| Primarily builds Android apps | Firebase Test Lab | Free tier, deep Android integration |
| Has enterprise compliance requirements | Sauce Labs | SOC 2 Type 2, HIPAA available |
| Needs maximum device diversity | BrowserStack | Largest real device catalog |
## BrowserStack Integration Example

### GitHub Actions Workflow
```yaml
# .github/workflows/mobile-tests.yml
name: Mobile E2E Tests
on:
  push:
    branches: [main]
jobs:
  mobile-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - device: "iPhone 15"
            os_version: "18"
            platform: "ios"
          - device: "Samsung Galaxy S24"
            os_version: "14.0"
            platform: "android"
          - device: "Google Pixel 8"
            os_version: "14.0"
            platform: "android"
    steps:
      - uses: actions/checkout@v4
      - name: Upload app to BrowserStack
        env:
          BS_USER: ${{ secrets.BS_USER }}
          BS_KEY: ${{ secrets.BS_KEY }}
        run: |
          # Upload the .apk (iOS matrix entries would upload the .ipa build instead)
          RESPONSE=$(curl -u "$BS_USER:$BS_KEY" \
            -X POST "https://api-cloud.browserstack.com/app-automate/upload" \
            -F "file=@./builds/app-release.apk")
          echo "APP_URL=$(echo "$RESPONSE" | jq -r '.app_url')" >> "$GITHUB_ENV"
      - name: Run Appium tests
        env:
          BROWSERSTACK_USER: ${{ secrets.BS_USER }}
          BROWSERSTACK_KEY: ${{ secrets.BS_KEY }}
          DEVICE: ${{ matrix.device }}
          OS_VERSION: ${{ matrix.os_version }}
        run: |
          # conftest.py reads DEVICE, OS_VERSION, and APP_URL from the environment
          pytest tests/mobile/ \
            --junitxml="results-${{ matrix.platform }}-${{ matrix.os_version }}.xml"
```
### BrowserStack Capabilities Configuration
```python
# conftest.py -- BrowserStack configuration
import os

import pytest
from appium import webdriver
from appium.options.android import UiAutomator2Options


@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    """Attach each phase's report to the test item so fixtures can read pass/fail."""
    outcome = yield
    report = outcome.get_result()
    setattr(item, f"rep_{report.when}", report)


@pytest.fixture
def bs_driver(request):
    """Create a BrowserStack Appium driver."""
    device = os.environ.get("DEVICE", "Samsung Galaxy S24")
    os_version = os.environ.get("OS_VERSION", "14.0")
    app_url = os.environ.get("APP_URL")

    options = UiAutomator2Options()
    options.set_capability("platformName", "Android")
    options.set_capability("appium:deviceName", device)
    options.set_capability("appium:platformVersion", os_version)
    options.set_capability("appium:app", app_url)
    # BrowserStack-specific capabilities
    options.set_capability("bstack:options", {
        "userName": os.environ["BROWSERSTACK_USER"],
        "accessKey": os.environ["BROWSERSTACK_KEY"],
        "projectName": "MyApp Mobile Tests",
        "buildName": f"Build-{os.environ.get('GITHUB_SHA', 'local')[:8]}",
        "sessionName": request.node.name,
        "debug": True,
        "networkLogs": True,
        "video": True,
        "appiumVersion": "2.6.0",
        "idleTimeout": 300,
    })

    driver = webdriver.Remote(
        command_executor="https://hub-cloud.browserstack.com/wd/hub",
        options=options,
    )
    yield driver
    try:
        # Mark test status in BrowserStack; rep_call is attached by the
        # pytest_runtest_makereport hook above
        rep_call = getattr(request.node, "rep_call", None)
        status = "passed" if rep_call is not None and rep_call.passed else "failed"
        driver.execute_script(
            f"browserstack_executor: {{\"action\": \"setSessionStatus\", "
            f"\"arguments\": {{\"status\": \"{status}\"}}}}"
        )
    finally:
        driver.quit()
```
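The escaped f-string used to mark session status is fragile once a status reason contains quotes. A small helper can build the same executor payload with `json.dumps` instead (a sketch; `session_status_script` is an illustrative name, not part of the Appium or BrowserStack APIs):

```python
import json


def session_status_script(status: str, reason: str = "") -> str:
    """Build the browserstack_executor payload that marks a session passed/failed.

    json.dumps handles quoting, so reasons containing quotes stay valid JSON.
    """
    payload = {
        "action": "setSessionStatus",
        "arguments": {"status": status, "reason": reason},
    }
    return f"browserstack_executor: {json.dumps(payload)}"
```

The fixture teardown would then call `driver.execute_script(session_status_script(status))`.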
## Firebase Test Lab for Android
Firebase Test Lab offers a generous free tier and deep integration with the Android ecosystem:
```bash
# Run instrumentation tests on Firebase Test Lab
# (model IDs are device codenames: panther = Pixel 7, oriole = Pixel 6;
#  list them with `gcloud firebase test android models list`)
gcloud firebase test android run \
  --type instrumentation \
  --app builds/app-release.apk \
  --test builds/app-test.apk \
  --device model=panther,version=34 \
  --device model=oriole,version=33 \
  --timeout 10m \
  --results-bucket gs://my-test-results \
  --results-dir "run-$(date +%Y%m%d-%H%M%S)"

# Run a robo test (automatic exploration)
gcloud firebase test android run \
  --type robo \
  --app builds/app-release.apk \
  --device model=panther,version=34 \
  --timeout 5m \
  --robo-directives "text:username_field=test@example.com,text:password_field=pass123"
```
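The `--robo-directives` string grows unwieldy as login fields accumulate. A tiny helper (illustrative, not part of any Google SDK) can build it from a dict:

```python
def robo_directives(fields: dict[str, str], action: str = "text") -> str:
    """Build a --robo-directives value such as
    "text:username_field=foo,text:password_field=bar" from a resource-id -> value map."""
    return ",".join(f"{action}:{resource_id}={value}" for resource_id, value in fields.items())
```

This keeps the credentials in one place in your CI script and avoids hand-editing a long comma-separated flag.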
### Firebase Robo Tests
Firebase's Robo test automatically crawls your app, tapping buttons and filling forms to discover crashes. This is useful as a smoke test because it requires zero test code:
```yaml
# .github/workflows/firebase-robo.yml
name: Firebase Robo Test
on:
  push:
    branches: [main]
jobs:
  robo-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: google-github-actions/auth@v2
        with:
          credentials_json: ${{ secrets.GCP_CREDENTIALS }}
      - uses: google-github-actions/setup-gcloud@v2
      - name: Run Robo test
        # gcloud exits non-zero when the robo crawl hits a crash or failure,
        # so this step (and the job) fails automatically on a crash
        run: |
          gcloud firebase test android run \
            --type robo \
            --app builds/app-release.apk \
            --device model=panther,version=34 \
            --device model=a54x,version=34 \
            --timeout 5m
```
## AWS Device Farm
AWS Device Farm is pay-per-minute with no subscription, making it cost-effective for teams that run tests less frequently:
```python
# Run tests on AWS Device Farm
import boto3


def create_test_run(project_arn, app_arn, test_package_arn, device_pool_arn):
    client = boto3.client("devicefarm", region_name="us-west-2")
    run = client.schedule_run(
        projectArn=project_arn,
        appArn=app_arn,
        test={
            "type": "APPIUM_PYTHON",
            "testPackageArn": test_package_arn,
        },
        # Pass the ARN of a curated pool (e.g. "Top Devices") or one you created
        devicePoolArn=device_pool_arn,
        executionConfiguration={
            "jobTimeoutMinutes": 30,
        },
    )
    return run["run"]["arn"]
```
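`schedule_run` returns immediately, so gating a CI job on the outcome means polling the run. A hand-rolled waiter built on the real `get_run` call might look like this (a sketch; the client is passed in so it can be stubbed in tests):

```python
import time


def wait_for_run(client, run_arn, poll_seconds=30, timeout_seconds=3600):
    """Poll Device Farm until the run completes, then return its result string."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        run = client.get_run(arn=run_arn)["run"]
        if run["status"] == "COMPLETED":
            return run["result"]  # e.g. "PASSED", "FAILED", "ERRORED"
        time.sleep(poll_seconds)
    raise TimeoutError(f"Device Farm run {run_arn} did not complete in {timeout_seconds}s")
```

A CI script would call `wait_for_run(client, create_test_run(...))` and fail the build unless the result is `"PASSED"`.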
## Best Practices for Cloud Device Farms

- **Parallelize aggressively** -- Cloud farms charge per device-minute. Running 10 devices in parallel for 5 minutes costs the same as 1 device for 50 minutes but finishes 10x faster.
- **Use smart test distribution** -- Not all tests need to run on all devices. Run full regression on Tier 1 devices, critical paths on Tier 2, and smoke tests on Tier 3.
- **Cache app uploads** -- Upload your app binary once per build, then reference it across all device runs.
- **Enable video recording for failures only** -- Video recording adds overhead. Enable it for debugging failed runs, not for every successful test.
- **Set idle timeouts** -- Cloud devices are billed while idle. Set aggressive timeouts (3-5 minutes) to prevent zombie sessions.
- **Tag builds for traceability** -- Include the git SHA, branch name, and PR number in build tags so you can correlate test results with code changes:

  ```python
  build_name = f"{os.environ.get('GITHUB_SHA', 'local')[:8]}-" \
               f"{os.environ.get('GITHUB_REF_NAME', 'dev')}"
  ```
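The tiered distribution described above can be sketched as a simple mapping from device to pytest marker expression (the device names and tier assignments here are illustrative; base yours on your own usage analytics):

```python
# Illustrative tier assignments -- replace with your analytics-driven device list
DEVICE_TIERS = {
    "Samsung Galaxy S24": 1,
    "Google Pixel 8": 1,
    "Samsung Galaxy A54": 2,
    "Google Pixel 6": 2,
    "Motorola Moto G84": 3,
}

# pytest -m expression to run per tier (empty string = full regression)
TIER_SUITES = {1: "", 2: "critical", 3: "smoke"}


def suite_for(device: str) -> str:
    """Return the pytest marker expression for a device's test suite."""
    tier = DEVICE_TIERS.get(device, 3)  # unknown devices get smoke coverage only
    return TIER_SUITES[tier]
```

A CI matrix job would then run `pytest tests/mobile/ -m "$(suite_for "$DEVICE")"` for Tier 2 and 3 devices, with the `-m` flag omitted for Tier 1.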
Cloud device farms are the only practical way to achieve broad device coverage for mobile applications. The upfront investment in configuration pays for itself in testing confidence and in bugs caught before release.