RNetwork is a lightweight, lifecycle-aware live internet connection status library based on RxNetwork and Crouton.

How does another developer add this as a dependency?

STEP 1: Add the Bintray repository to your project-level build.gradle:

    allprojects {
      repositories {
        // ...
        maven { url 'https://dl.bintray.com/rrsaikat/RNetwork' }
      }
    }

STEP 2: Reference the library itself in your module-level build.gradle:

    implementation 'com.rezwan.knetworklib:knetworklib:1.0.3'

STEP 3: KNetwork.initialize(this) – declare this in your Application class:

    class App : Application() {
        override fun onCreate() {
            super.onCreate()
            KNetwork.initialize(this)
        }
    }
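
Note that KNetwork.initialize(this) only runs if the App class is registered in AndroidManifest.xml via the standard android:name attribute:

    <application android:name=".App">
        <!-- activities, etc. -->
    </application>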

STEP 4: KNetwork.bind(this, lifecycle) – bind each activity in which you want to show the network status:

    KNetwork.bind(this, lifecycle)
        .setConnectivityListener(this)

Available additional methods:

*  showKNDialog() - pass true to show a dialog when the network connection goes off.
*  setConnectivityListener() - delivers connected/disconnected callbacks to the activity.
*  setInAnimation() - sets a custom entry animation for the status view.
*  setOutAnimation() - sets a custom exit animation for the status view.
*  setViewGroupResId() - sets the target ViewGroup in which to show the network status views (see the combined sketch below).
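
Putting these together, a bound activity might look roughly like the sketch below. This is an illustration only: the builder methods mirror the list above, but the container id R.id.root_layout is hypothetical and the exact listener interface to implement isn't shown here, so check the library source for the real signatures.

    // Sketch only: assumes KNetwork.bind() returns a builder exposing the methods above.
    class MainActivity : AppCompatActivity() /* + the library's connectivity listener interface */ {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            setContentView(R.layout.activity_main)

            KNetwork.bind(this, lifecycle)
                .showKNDialog(true)                  // show a dialog when the connection goes off
                .setViewGroupResId(R.id.root_layout) // hypothetical container for the status view
                .setConnectivityListener(this)       // delivers connected/disconnected callbacks
        }
    }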

Here is a simple demonstration of the library:

Video Tutorial (Bonus):

Explore and download the full project from GitHub.

Follow me at:

GitHub profile of rrsaikat
Find me on Facebook

In this article, we will dive into the world of Android AR, i.e. Augmented Reality, specifically ARCore, Google’s platform for building AR experiences. We will see how ARCore is transforming AR application development by abstracting away complex matrix and vector math and providing clean APIs for AR development.

To start with, let’s take a look at what augmented reality is and why we, as developers, should be excited about this new technology!

What is Augmented Reality?

According to the dictionary definition, augmented reality is “a technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view”.

Essentially, AR is a technology that enables us to render computer-generated 3D models into the real world and have them interact with their surroundings as if they were physically present at that location.

This technology has vast applications in the areas of:

  • Education: Imagine having a 3D model of a human brain on your desk.
  • Tourism: Placing 3D models of popular monuments in the physical world.
  • Furniture Retail: Checking how a chair would look in your living room before making a purchase.
  • E-Commerce: Checking out your new outfit in 3D right in front of you.
  • Medicine and Healthcare: Having a 3D model of the various proteins in a drug right inside the chemistry lab.
  • And many more…

A few years ago, developing AR applications meant learning OpenGL and complex vector math. In 2018, Google released ARCore along with the Sceneform SDK (for Android) to make AR development easier for everyone. So, let’s have a look at what ARCore has to offer.

What is ARCore?

According to the definition provided by Google, ARCore is a platform for building Android AR experiences. It enables your phone to sense its environment, understand the world and interact with the information.

ARCore works on 3 principles:

  • Motion Tracking: It allows the phone to understand its current position relative to the real world.
  • Understanding the Environment: It allows the phone to detect the size and location of all types of surfaces: vertical, horizontal, and angled.
  • Light Estimation: It allows the phone to sense the environment’s lighting condition.

As the user moves their phone through the real world, ARCore understands the surroundings and builds a digital emulation of the real world, in which it can place objects. Motion tracking helps ARCore identify feature points, which allow it to keep track of the phone’s location relative to the real environment.

As of now, ARCore is available for:

  • Java (Android)
  • Unity (iOS and Android)
  • Unreal Engine
  • iOS

This list covers most of the devices and development platforms used for AR application development.

Sceneform

ARCore in itself isn’t an SDK; rather, it is an engine that helps SDKs render objects. To make this functionality usable, Google released the Sceneform SDK so that developers can build Android AR apps without having to learn OpenGL.

Sceneform comes with many nifty features such as:

  • An automatic compatibility check for ARCore-enabled phones.
  • Checking for camera permissions.
  • A scene graph API to abstract all the complexities.
  • A plugin for manipulating 3D assets.

We will now delve into building a sample Android AR application using Sceneform. This will help you understand Sceneform and ARCore in much more depth.

Sceneform provides a high-level API for rendering 3D models using Java, which makes creating AR experiences much easier.

In this tutorial, I will walk through some of the basics of the Sceneform API:

  • Adding the Sceneform fragment to an Android application; it handles ARCore session creation, requests camera and storage permissions, and provides common UX elements.
  • Importing 3D models into your Android Studio project.
  • Placing multiple objects in the scene using a node based scene graph.
  • Handling gestures for placing and moving objects in the AR scene.
  • Taking a picture of the AR experience.
  • Recording full-screen video.

Prerequisites

Make sure you have the following before starting the project:

  • Android Studio 3.1 or greater
  • Androidx and Kotlin support
  • ARCore compatible device and USB cable
  • Access to the internet to download dependencies while building the app

Later on we’ll copy some 3D assets for the codelab from the sample project on GitHub. You can download these sample assets for the project. The zip file also includes the completed project for reference.

For more information about getting started, see the Sceneform documentation.

See Enable developer options and debugging for more details on how to enable developer options for your device.

Now that you have everything you need, let’s get started!

Overview

Here’s how the completed app would look like:

Now follow me step by step:

1. Create a New Project

In Android Studio, create a new project targeting API level 24 (Android 7.0) or later. In the form-factor screen, select Phone and Tablet with minimum SDK “API 24: Android 7.0 (Nougat)”.

2. Configure Project

Configure the project as shown, then click Finish and wait until the Gradle build completes.

Our initial project will look like this:

3. Adding the Sceneform Plugin

You will need to install the Sceneform plugin in Android Studio. The plugin helps with tasks such as importing a model into your Android project.

In order to install the plugin, follow the steps given below:

  • For Windows users: Go to: File-> Settings-> Plugins
  • For macOS users: Go to: Android Studio-> Preferences-> Plugins
  • Now enter “Sceneform” in the search bar. The plugin, named Google Sceneform Tools, will appear at the top.
  • Install the plugin and restart Android Studio.

4. Adding Dependencies

Add the following dependencies, update the language level to Java 8, and apply the Sceneform plugin in your app-level build.gradle file:

Important Note: The Sceneform SDK requires minSdkVersion 24 or higher, so make sure your minSdkVersion is set to at least 24. Also, make sure you have included Google’s Maven repository in your project-level build.gradle.

android {
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
dependencies {
    implementation 'com.google.ar:core:1.12.0'
    implementation "com.google.ar.sceneform.ux:sceneform-ux:1.12.0"
    implementation 'com.google.android.material:material:1.0.0'
}

apply plugin: 'com.google.ar.sceneform.plugin'

Now go to your project-level build.gradle file, add this classpath to the buildscript dependencies, and click Sync Project:

classpath 'com.google.ar.sceneform:plugin:1.12.0'

5. Updating Manifest

Add the following lines in your AndroidManifest.xml file:

<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera.ar" android:required="true" />

Using ARCore requires camera permission and a camera-enabled phone (obviously).

Also, add a meta-data entry inside your application tag:

<meta-data android:name="com.google.ar.core" android:value="required" />

Note: If your app strictly requires an ARCore-enabled device, keep android:value="required". If AR is not a primary feature, or you handle compatibility for unsupported devices yourself, you can mark AR as optional instead.
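
For reference, the “AR Optional” variant uses the standard ARCore manifest values:

<uses-feature android:name="android.hardware.camera.ar" android:required="false" />
<meta-data android:name="com.google.ar.core" android:value="optional" />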

6. Adding the ArFragment

With all the initial setup done, it is now time to add an ArFragment (provided by the Sceneform SDK) to our app. ArFragment automatically handles the ARCore session and the runtime checks necessary for the application to work.

If ARCore is not installed on the user’s device, ArFragment prompts the user to install it. Likewise, if camera permission has not been granted, it requests that too. Hence, ArFragment is the best way to start building your very first Android ARCore application.

But if your app still needs some extended functionality, you can always subclass ArFragment and create a Fragment of your own to support your custom features.
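
As a minimal sketch of such a subclass, modeled on the Sceneform samples and assuming the getAdditionalPermissions() hook from BaseArFragment, here is a fragment that additionally requests the storage permission used later for saving captures:

import android.Manifest
import com.google.ar.sceneform.ux.ArFragment

// Requests WRITE_EXTERNAL_STORAGE on top of the camera permission
// that ArFragment already asks for.
class WritingArFragment : ArFragment() {
    override fun getAdditionalPermissions(): Array<String> =
        super.getAdditionalPermissions() + Manifest.permission.WRITE_EXTERNAL_STORAGE
}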

activity_main.xml
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">


    <fragment
        android:id="@+id/ux_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="0dp"
        android:layout_height="0dp"
        app:layout_constraintBottom_toTopOf="@+id/textView"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <com.google.android.material.floatingactionbutton.FloatingActionButton
        android:id="@+id/showObject"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_margin="10dp"
        android:src="@drawable/ic_eye"
        android:backgroundTint="#0083A9"
        app:layout_constraintBottom_toBottomOf="@+id/ux_fragment"
        app:layout_constraintEnd_toEndOf="parent" />


    <TextView
        android:id="@+id/textView"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:background="#0083A9"
        android:gravity="center"
        android:padding="10dp"
        android:text="Great! Use one finger to move and two fingers to rotate"
        android:textColor="#ffffff"
        app:layout_constraintBottom_toBottomOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

Here’s how your layout activity_main.xml file would look like:

7. Checking compatibility at runtime

We will check if the device:

  1. Is running Android API version >= 24.
  2. Can support OpenGL version 3.0.

The above conditions are mandatory for a device to support AR applications using ARCore and Sceneform SDK.

We finish the activity if these conditions aren’t satisfied; however, you could instead continue and support non-AR features on unsupported devices.

// Constants used below; define them at the top level or in a companion object.
private const val TAG = "MainActivity"
private const val MIN_OPENGL_VERSION = 3.0

fun checkIsSupportedDeviceOrFinish(activity: Activity): Boolean {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
        Log.e(TAG, "Sceneform requires Android N or later")
        Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG)
            .show()
        activity.finish()
        return false
    }
    val openGlVersionString =
        (activity.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager)
            .deviceConfigurationInfo
            .glEsVersion
    if (openGlVersionString.toDouble() < MIN_OPENGL_VERSION) {
        Log.e(TAG, "Sceneform requires OpenGL ES 3.0 or later")
        Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG)
            .show()
        activity.finish()
        return false
    }
    return true
}

Now run the app, and the project will look like this:

8. Adding 3D models to our application

It is now time to download and import the 3D model to be rendered in our application. In our case, we will be rendering a 3D saucepan in a corner of our room and moving it around.

Download the 3D model from here

After your model finishes downloading, you will need to extract the downloaded zip file into the sampledata folder.

You will find a .mtl file, an .obj file, and a .png image of the model. We’ll import the .obj file into our application using the Sceneform plugin.

9. Importing the model using Sceneform plugin

Open the Project view in Android Studio and expand the app folder. You should see a folder named “sampledata”; if not, create one. Put your model folder inside it.

Right-click the .obj file and choose the “Import Sceneform Asset” option. Leave the settings at their defaults. After the import finishes, Gradle syncs the project to include the asset in your application.

With this, you are done importing the 3D asset into your application. Now it’s time to write some code to include the model into the AR scene.
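
For reference, the import step records a sceneform.asset() entry in your app-level build.gradle; with this saucepan model it would look roughly like the following (exact paths depend on where you extracted the files):

sceneform.asset('sampledata/saucepan.obj', // source .obj file
        'default',                         // material
        'sampledata/saucepan.sfa',         // generated .sfa description
        'src/main/assets/saucepan')        // .sfb output, loaded later as "saucepan.sfb"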

10. Building the Model

Add the following code to your Kotlin file:

MainActivity.kt

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    if (!checkIsSupportedDeviceOrFinish(this)) {
        return
    }

    setContentView(R.layout.activity_main)
    val arFragment = supportFragmentManager.findFragmentById(R.id.ux_fragment) as ArFragment?

    arFragment?.setOnTapArPlaneListener { hitResult: HitResult, plane: Plane, motionEvent: MotionEvent ->
        val anchor = hitResult.createAnchor()
        placeObject(arFragment, anchor, Uri.parse("saucepan.sfb"))
    }
}

Let’s see what’s happening here.

  1. First, we get the fragment we added in our layout file with the help of supportFragmentManager and the fragment id.
  2. Then we need to load the model into the scene. For this, we use the ModelRenderable class provided by the Sceneform SDK. With the help of ModelRenderable’s setSource() method, we can load our model by passing in the name of the generated .sfb file.
  3. The model is built on a background thread; once loaded, it is handed over to the main thread, which renders it into the scene.
  4. We receive the model inside the thenAccept method. If there’s any error in building the model, an exception is thrown.

Our model is loaded, now let’s place it into the scene.

11. Adding the Model to the AR Scene

Our AR fragment is the container of the scene, so we need to add a model to it whenever the user taps. Hence, we added the setOnTapArPlaneListener to our fragment earlier.

Using the hitResult, we can get the tapped location and create an anchor node, which is the root node of our scene (imagine an augmented reality scene as an inverted tree).

Next, we create a TransformableNode, which will be our saucepan model, and attach it to the anchor node. A transformable node can react to location and size changes when the user drags the object or pinches to zoom.

Let’s have a look at some terminologies here:

  • Scene: It’s the place where our 3D world will be rendered.
  • HitResult: An imaginary ray of light coming from infinity; its first point of intersection with the real world is the tap point.
  • Anchor: A fixed location in the real world. Used to transform local coordinates (according to user’s display) to the real-world coordinates.
  • TransformableNode: A node that can react to user’s interactions such as rotation, zoom and drag.

Here’s how your final java file would look like:

private fun placeObject(arFragment: ArFragment, anchor: Anchor, uri: Uri) {
    ModelRenderable.builder()
        .setSource(arFragment.context, uri)
        .build()
        .thenAccept { modelRenderable -> addNodeToScene(arFragment, anchor, modelRenderable) }
        .exceptionally { throwable ->
            Toast.makeText(arFragment.context, "Error:" + throwable.message, Toast.LENGTH_LONG)
                .show()
            null
        }
}

private fun addNodeToScene(arFragment: ArFragment, anchor: Anchor, renderable: Renderable) {
    val anchorNode = AnchorNode(anchor)
    val node = TransformableNode(arFragment.transformationSystem)
    node.renderable = renderable
    node.setParent(anchorNode)
    arFragment.arSceneView.scene.addChild(anchorNode)
    node.select()
}

12. Final Output

That’s it! We have built a fully functional Android AR app. You can check the entire source code on github.


Here’s how our project will look:

Step 1:

Create a project and add dependencies in Android Studio.

dependencies {
    implementation 'com.google.firebase:firebase-core:16.0.8'
    implementation 'com.google.firebase:firebase-messaging:17.4.0'
    implementation 'com.google.firebase:firebase-database:16.1.0'
    implementation 'com.google.firebase:firebase-auth:16.2.0'
    implementation 'com.github.joielechong:countrycodepicker:2.1.8'
    implementation 'com.github.GoodieBag:Pinview:v1.3'
}

Step 2:

Step 3:

Step 4:

Step 5: