Lesson 57 of 83 · Intermediate

Native/Platform APIs: Camera, Bluetooth, NFC, Sensors & Maps

Discussing hardware APIs confidently in senior Android interviews

Open interactive version (quiz + challenge)

Real-world analogy

Your Android phone is like a Swiss Army knife — Camera is the blade, Bluetooth is the connector cable, NFC is the tap-to-transfer button, sensors are the compass and spirit level, and Maps is the GPS. A senior developer does not need to be an expert in all of them, but must know which tool to grab and why.

What is it?

Android's platform APIs provide programmatic access to device hardware: CameraX for photo and video, Bluetooth for device pairing and data transfer, NFC for contactless communication, the sensor framework for motion and environment data, and Google Maps SDK for mapping and location. Senior developers must know the correct modern API for each, common permission pitfalls, and how to discuss each one intelligently in an architecture conversation.
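One way to keep the permission story straight is a simple lookup table. The sketch below is plain Kotlin (no Android dependency) so it can be unit-tested; the permission strings are the real manifest constants, and the SDK-version cutoffs reflect current platform behavior, but always verify against your own targetSdk:

```kotlin
enum class HardwareFeature { CAMERA, BLE, NFC, STEP_COUNTER, LOCATION }

// Which runtime permissions each hardware API needs, by SDK level.
fun runtimePermissionsFor(feature: HardwareFeature, sdkInt: Int = 33): List<String> = when (feature) {
    HardwareFeature.CAMERA -> listOf("android.permission.CAMERA")
    HardwareFeature.BLE ->
        if (sdkInt >= 31) listOf("android.permission.BLUETOOTH_SCAN", "android.permission.BLUETOOTH_CONNECT")
        else listOf("android.permission.ACCESS_FINE_LOCATION") // pre-31, BLE scans piggyback on location
    HardwareFeature.NFC -> emptyList() // NFC is an install-time permission, never a runtime prompt
    HardwareFeature.STEP_COUNTER ->
        if (sdkInt >= 29) listOf("android.permission.ACTIVITY_RECOGNITION") else emptyList()
    HardwareFeature.LOCATION -> listOf("android.permission.ACCESS_FINE_LOCATION")
}
```

A table like this is also a handy interview answer: it shows you know that NFC never prompts at runtime while BLE scanning changed permission models entirely at API 31.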

Real-world relevance

A field inspection app uses CameraX to capture defect photos, BLE to read data from IoT sensors on equipment, NFC to identify equipment by tapping a tag, the accelerometer to detect phone orientation for AR overlays, and Maps to show where the equipment is located. All with proper runtime permission handling and graceful degradation when hardware is absent.
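The Maps piece of that app ultimately reduces to working with coordinates. As a sketch of the underlying math (plain Kotlin, no Play Services — on-device you would normally use Location.distanceBetween instead), here is the haversine great-circle distance between two equipment locations:

```kotlin
import kotlin.math.*

// Great-circle distance in meters between two lat/lng points (haversine formula).
// Uses the mean Earth radius; accurate to ~0.5%, plenty for "how far is this bin?" UI.
fun haversineMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val r = 6371008.8
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
        cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * r * asin(sqrt(a))
}
```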

Code example

// CameraX — capture a photo
class CameraActivity : AppCompatActivity() {
    private lateinit var viewBinding: ActivityCameraBinding // generated view binding (layout name assumed)
    private lateinit var imageCapture: ImageCapture

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        viewBinding = ActivityCameraBinding.inflate(layoutInflater)
        setContentView(viewBinding.root)
        val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
        cameraProviderFuture.addListener({
            val cameraProvider = cameraProviderFuture.get()

            val preview = Preview.Builder().build().also {
                it.setSurfaceProvider(viewBinding.viewFinder.surfaceProvider)
            }
            imageCapture = ImageCapture.Builder()
                .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
                .build()

            cameraProvider.unbindAll()
            cameraProvider.bindToLifecycle(this, CameraSelector.DEFAULT_BACK_CAMERA, preview, imageCapture)
        }, ContextCompat.getMainExecutor(this))
    }

    fun takePhoto() {
        val photoFile = File(cacheDir, "photo_${System.currentTimeMillis()}.jpg")
        val outputOptions = ImageCapture.OutputFileOptions.Builder(photoFile).build()

        imageCapture.takePicture(outputOptions, ContextCompat.getMainExecutor(this),
            object : ImageCapture.OnImageSavedCallback {
                override fun onImageSaved(output: ImageCapture.OutputFileResults) {
                    // photo saved to photoFile
                }
                override fun onError(exc: ImageCaptureException) {
                    Log.e("Camera", "Capture failed: ${exc.message}", exc)
                }
            }
        )
    }
}

// BLE scan and connect (simplified)
class BleManager(private val context: Context) {
    private val bluetoothAdapter: BluetoothAdapter? =
        (context.getSystemService(Context.BLUETOOTH_SERVICE) as BluetoothManager).adapter

    // Requires the BLUETOOTH_SCAN runtime permission on API 31+ (ACCESS_FINE_LOCATION on older releases)
    fun scanForDevices(onDeviceFound: (BluetoothDevice) -> Unit) {
        val scanner = bluetoothAdapter?.bluetoothLeScanner ?: return
        val filter = ScanFilter.Builder()
            .setServiceUuid(ParcelUuid(SERVICE_UUID))
            .build()
        val settings = ScanSettings.Builder()
            .setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY)
            .build()

        scanner.startScan(listOf(filter), settings, object : ScanCallback() {
            override fun onScanResult(callbackType: Int, result: ScanResult) {
                onDeviceFound(result.device)
            }
        })
    }

    fun connect(device: BluetoothDevice, onDataReceived: (ByteArray) -> Unit): BluetoothGatt {
        return device.connectGatt(context, false, object : BluetoothGattCallback() {
            override fun onConnectionStateChange(gatt: BluetoothGatt, status: Int, newState: Int) {
                if (newState == BluetoothProfile.STATE_CONNECTED) gatt.discoverServices()
            }
            override fun onServicesDiscovered(gatt: BluetoothGatt, status: Int) {
                val char = gatt.getService(SERVICE_UUID)?.getCharacteristic(CHAR_UUID) ?: return
                gatt.setCharacteristicNotification(char, true)
                // Also write the CCCD descriptor (0x2902) — without this write the peripheral
                // never actually starts sending notifications
                val cccd = char.getDescriptor(UUID.fromString("00002902-0000-1000-8000-00805f9b34fb"))
                cccd.value = BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE
                gatt.writeDescriptor(cccd)
            }
            override fun onCharacteristicChanged(gatt: BluetoothGatt, characteristic: BluetoothGattCharacteristic) {
                onDataReceived(characteristic.value)
            }
        })
    }

    companion object {
        val SERVICE_UUID: UUID = UUID.fromString("0000180d-0000-1000-8000-00805f9b34fb")
        val CHAR_UUID: UUID = UUID.fromString("00002a37-0000-1000-8000-00805f9b34fb")
    }
}

// Sensor — step counter
class StepCounterManager(context: Context) : SensorEventListener {
    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val stepSensor = sensorManager.getDefaultSensor(Sensor.TYPE_STEP_COUNTER)
    private var baseline: Long = -1
    var stepCount: Long = 0 // steps in the current session
        private set

    fun register() {
        // TYPE_STEP_COUNTER is absent on some devices — getDefaultSensor() returns null there
        stepSensor?.let { sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL) }
    }

    fun unregister() { sensorManager.unregisterListener(this) }

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type == Sensor.TYPE_STEP_COUNTER) {
            // The sensor reports total steps since reboot; subtract a session baseline
            val total = event.values[0].toLong()
            if (baseline < 0) baseline = total
            stepCount = total - baseline
        }
    }
    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}

// FusedLocationProviderClient
class LocationManager(private val context: Context) {
    private val fusedClient = LocationServices.getFusedLocationProviderClient(context)

    // Caller must have been granted ACCESS_FINE_LOCATION (or COARSE) at runtime
    fun getCurrentLocation(onResult: (Location?) -> Unit) {
        fusedClient.getCurrentLocation(Priority.PRIORITY_HIGH_ACCURACY, null)
            .addOnSuccessListener { location -> onResult(location) }
            .addOnFailureListener { onResult(null) }
    }
}
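NFC is discussed throughout this lesson but not shown above. On-device you would enable reader mode and pull NdefMessage records off the tag; the payload of a well-known Text record is just bytes, though, so the decoding step can be sketched and tested off-device (a minimal sketch; error handling for malformed tags is omitted):

```kotlin
// Decode an NDEF well-known Text record payload (NFC Forum "T" type).
// Layout: [status byte][language code][text]. Bit 7 of the status byte selects
// UTF-16 vs UTF-8; bits 0-5 give the language-code length.
fun decodeNdefText(payload: ByteArray): String {
    val status = payload[0].toInt()
    val isUtf16 = status and 0x80 != 0
    val langLength = status and 0x3F
    val charset = if (isUtf16) Charsets.UTF_16 else Charsets.UTF_8
    return String(payload, 1 + langLength, payload.size - 1 - langLength, charset)
}
```

On a real device the payload would come from something like `NdefMessage.records[0].payload` after a tag discovery intent or reader-mode callback.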

Line-by-line walkthrough

  1. ProcessCameraProvider.getInstance() returns a ListenableFuture — the addListener callback runs on the main executor once the camera provider is available.
  2. bindToLifecycle() ties the camera session to the Activity lifecycle — CameraX automatically opens/closes the camera on resume/pause, preventing resource leaks.
  3. ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY prioritizes speed over quality — use CAPTURE_MODE_MAXIMIZE_QUALITY for still photos where wait time is acceptable.
  4. ImageCapture.takePicture() saves directly to a file via OutputFileOptions — more efficient than capturing to an in-memory Bitmap for full-resolution photos.
  5. BluetoothLeScanner.startScan with a ScanFilter targeting a specific service UUID reduces battery usage — scanning without filters drains the battery rapidly.
  6. onServicesDiscovered fires after discoverServices() completes — this is the correct place to get Characteristic references, not onConnectionStateChange.
  7. setCharacteristicNotification enables notifications client-side only; you must also write ENABLE_NOTIFICATION_VALUE to the characteristic's CCCD descriptor — omitting that write is a classic BLE bug.
  8. Registering the SensorEventListener in onResume and unregistering in onPause is the correct lifecycle pattern — sensors drain battery if left registered in the background.
  9. Sensor.TYPE_STEP_COUNTER reports total steps since reboot as a monotonically increasing float — save a baseline at session start and subtract it to get session steps.
  10. FusedLocationProviderClient.getCurrentLocation with PRIORITY_HIGH_ACCURACY will use GPS — for battery-sensitive apps, prefer PRIORITY_BALANCED_POWER_ACCURACY.
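The UUIDs in the BLE example above are in fact the standard Heart Rate service (0x180D) and Heart Rate Measurement characteristic (0x2A37), so the bytes arriving in onCharacteristicChanged follow a published format. Parsing them is plain byte work you can unit-test off-device (a minimal sketch covering only the heart-rate field; the characteristic can also carry energy and RR-interval fields that this ignores):

```kotlin
// Parse the heart-rate value from a Heart Rate Measurement (0x2A37) payload.
// Bit 0 of the flags byte: 0 = uint8 value, 1 = uint16 little-endian value.
fun parseHeartRateBpm(value: ByteArray): Int {
    val flags = value[0].toInt()
    return if (flags and 0x01 == 0) {
        value[1].toInt() and 0xFF
    } else {
        (value[1].toInt() and 0xFF) or ((value[2].toInt() and 0xFF) shl 8)
    }
}
```

Keeping protocol parsing in pure functions like this is also a good architecture talking point: the BluetoothGattCallback stays thin, and the byte logic gets real unit tests.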

Spot the bug

class SensorActivity : AppCompatActivity(), SensorEventListener {
    private lateinit var sensorManager: SensorManager
    private var accelerometer: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_FASTEST)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        updateUI(x, y, z)
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
Need a hint?
There are two bugs: one drains battery and leaks the Activity across rotations and backgrounding; the other floods the UI with far more sensor updates than it can use.
Show answer
Bug 1: registerListener() is called in onCreate() with no matching unregisterListener() in onPause() or onStop(). The sensor stays registered when the Activity goes to the background (screen off, app switch), continuously draining battery and CPU, and on every rotation the destroyed Activity instance is leaked because SensorManager still holds a reference to it. Fix: register in onResume(), unregister in onPause().

Bug 2: SENSOR_DELAY_FASTEST delivers events at the maximum hardware rate (up to ~200 Hz on some devices). Because registerListener() was called on the main thread with no Handler, every event is delivered on the main thread, so updateUI() runs hundreds of times per second — more than enough to cause jank and dropped frames. (Only if a background Handler were supplied would touching Views from the callback throw CalledFromWrongThreadException.) Fix: use SENSOR_DELAY_NORMAL (~5 updates/sec) or SENSOR_DELAY_GAME (~50 updates/sec) for UI-driven work, and keep View updates on the main thread.
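Even at SENSOR_DELAY_GAME you may want to cap how often the UI repaints. The rate fix can be expressed as a tiny throttle — a minimal sketch in plain Kotlin (the caller passes timestamps, e.g. event.timestamp converted to milliseconds):

```kotlin
// Drops events that arrive sooner than minIntervalMs after the last emitted one.
class UiUpdateThrottle(private val minIntervalMs: Long) {
    private var lastEmitMs = -1L

    fun shouldEmit(nowMs: Long): Boolean {
        if (lastEmitMs >= 0 && nowMs - lastEmitMs < minIntervalMs) return false
        lastEmitMs = nowMs
        return true
    }
}
```

Inside onSensorChanged you would guard the UI call with `if (throttle.shouldEmit(now)) updateUI(x, y, z)`, decoupling the sensor rate from the render rate.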

Explain like I'm 5

Your phone has special superpowers built in. Camera is its eyes. Bluetooth is like a wireless walkie-talkie to talk to other gadgets nearby. NFC is like a magic touch — tap your phone on a sticker and it instantly knows what the sticker says. Sensors are like the phone's sense of balance and feel. Maps is like giving the phone a brain that knows every road in the world.

Fun fact

NFC tags can be as tiny as a grain of rice when implanted under the skin. There is a growing community of biohackers who implant NFC chips in their hands to unlock doors, store contact info, or even carry crypto wallet keys. Android's NFC API can read these the same way it reads a tap-to-pay card.

Hands-on challenge

Design a warehouse inventory app that uses: (1) NFC to identify bins by tapping a tag (read NDEF text record). (2) CameraX ImageAnalysis + ML Kit to scan barcodes on items. (3) BLE to connect to a weight sensor on a shelf. (4) Maps to show warehouse layout. For each feature, specify: the permission(s) required, the key API class used, and one gotcha that could cause a production bug. Then explain how you would gracefully degrade if any hardware is unavailable.

More resources

← Back to course: Android Interview Mastery