
I have developed an Android application for barcode decoding that uses the Google Vision library for GS1 DataMatrix and the ZBar library for GS1-128 barcodes. I am unable to read the FNC1 character at the first position of a GS1-128 barcode using the ZBar library.

The ZBar library does not show any sign of an FNC1 character at the start of the barcode.

Is there a solution for this? Quick help would be appreciated.

Below is my ZBar Scanner Activity

 @SuppressWarnings("deprecation")
 public class ZBarFirstScannerActivity extends AppCompatActivity{

//TextView tv;
ImageView iv;
LinearLayout ll;
private Camera mCamera;
private CameraPreview mPreview;
private Handler autoFocusHandler;
private ImageScanner scanner;
private boolean barcodeScanned = false;
private boolean previewing = true;
TextView tv;

static {
    System.loadLibrary("iconv");
    System.loadLibrary("zbarjni");
}



@Override
public void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);



    setContentView(R.layout.barcode_capture1d);


    tv = (TextView) findViewById(R.id.textVertical);
    tv.setRotation(90);

    initToolbar();


    autoFocusHandler = new Handler();
    mCamera = getCameraInstance();
    // Instantiate and configure the barcode scanner

    scanner = new ImageScanner();
    scanner.setConfig(0, Config.X_DENSITY, 1);
    scanner.setConfig(0, Config.Y_DENSITY, 1);
    scanner.setConfig(Symbol.CODE128, Config.ENABLE,1);
    scanner.setConfig(Symbol.EAN13, Config.ENABLE,1);

    mPreview = new CameraPreview(this, mCamera, previewCb, autoFocusCB);
    FrameLayout preview = (FrameLayout)findViewById(R.id.cameraPreview);
    preview.addView(mPreview);


}

private void initToolbar() {

    final Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
    setSupportActionBar(toolbar);
    final ActionBar actionBar = getSupportActionBar();

    if (actionBar != null) {


        actionBar.setHomeButtonEnabled(true);
        actionBar.setHomeAsUpIndicator(ContextCompat.getDrawable(this, R.drawable.abc_ic_ab_back_mtrl_am_alpha));

        actionBar.setDisplayHomeAsUpEnabled(true);
    }
}
/** A safe way to get an instance of the Camera object. */
public static Camera getCameraInstance()
{
    Camera c = null;
    try
    {
        c = Camera.open();
    } catch (Exception e)
    {
        //nada
    }
    return c;
}

private void releaseCamera()
{
    if (mCamera != null)
    {
        previewing = false;
        mCamera.setPreviewCallback(null);
        mCamera.release();
        mCamera = null;
    }
}

PreviewCallback previewCb = new PreviewCallback()
{
    public void onPreviewFrame(byte[] data, Camera camera)
    {
        Camera.Parameters parameters = camera.getParameters();
        Size size = parameters.getPreviewSize();

        Image barcode = new Image(size.width, size.height, "Y800");
        barcode.setData(data);

        int result = scanner.scanImage(barcode);
        if (result != 0)
        {
            previewing = false;
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();
            SymbolSet syms = scanner.getResults();
            for (Symbol sym : syms)
            {
                barcodeScanned = true;

                Intent returnIntent = new Intent();
                returnIntent.putExtra("BARCODE", sym.getData());
                setResult(MainActivity.BAR_CODE_TYPE_128,returnIntent);
                releaseCamera();
                finish();
                break;
            }
        }
    }
};

// Mimic continuous auto-focusing
AutoFocusCallback autoFocusCB = new AutoFocusCallback()
{
    public void onAutoFocus(boolean success, Camera camera)
    {
        autoFocusHandler.postDelayed(doAutoFocus, 3000);
    }
};

private Runnable doAutoFocus = new Runnable()
{
    public void run()
    {
        if (previewing)
            mCamera.autoFocus(autoFocusCB);
    }
};

@Override
public void onPause() {
    super.onPause();
    releaseCamera();
}

@Override
public void onResume() {
    super.onResume();
    // Re-acquire the camera released in onPause(). Depending on your
    // CameraPreview implementation you may also need to hand the new
    // Camera instance back to mPreview.
    if (mCamera == null) {
        mCamera = getCameraInstance();
        previewing = true;
    }
}

@Override
public void onBackPressed() {

    releaseCamera();

    finish();
}

@Override
public boolean onOptionsItemSelected(MenuItem item) {
    int id = item.getItemId();

    if (id == android.R.id.home) {
        onBackPressed();
        return true;
    }
    return super.onOptionsItemSelected(item);
}
}

Below is my Google Scanner Activity

public final class GoogleScannerActivity extends AppCompatActivity {
private static final String TAG = "Barcode-reader";

// intent request code to handle updating play services if needed.
private static final int RC_HANDLE_GMS = 9001;

// permission request codes need to be < 256
private static final int RC_HANDLE_CAMERA_PERM = 2;

// constants used to pass extra data in the intent
public static final String AutoFocus = "AutoFocus";
public static final String UseFlash = "UseFlash";
public static final String BarcodeObject = "Barcode";
Bitmap bmp;
FileOutputStream fos = null;
private Camera c;

Switch aSwitch;
private CameraSource mCameraSource;
private CameraSourcePreview mPreview;
private GraphicOverlay<BarcodeGraphic> mGraphicOverlay;

// helper objects for detecting taps and pinches.
private ScaleGestureDetector scaleGestureDetector;
private GestureDetector gestureDetector;

/**
 * Initializes the UI and creates the detector pipeline.
 */
@Override
public void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    setContentView(R.layout.barcode_capture2d);
    initToolbar();

    ActivitySource.caller = this;
    mPreview = (CameraSourcePreview) findViewById(R.id.preview);
    mGraphicOverlay = (GraphicOverlay<BarcodeGraphic>) findViewById(R.id.graphicOverlay);

    boolean autoFocus = true;
    boolean useFlash = false;

    // Check for the camera permission before accessing the camera.  If the
    // permission is not granted yet, request permission.
    int rc = ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA);
    if (rc == PackageManager.PERMISSION_GRANTED) {
        createCameraSource(autoFocus, useFlash);
    } else {
        requestCameraPermission();
    }

    gestureDetector = new GestureDetector(this, new CaptureGestureListener());
    scaleGestureDetector = new ScaleGestureDetector(this, new ScaleListener());

    /*Snackbar.make(mGraphicOverlay, "Tap to capture. Pinch/Stretch to zoom",
            Snackbar.LENGTH_LONG)
            .show();*/
}

private void initToolbar() {

    final Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
    setSupportActionBar(toolbar);
    final ActionBar actionBar = getSupportActionBar();

    if (actionBar != null) {


        actionBar.setHomeButtonEnabled(true);
        actionBar.setHomeAsUpIndicator(ContextCompat.getDrawable(this, R.drawable.abc_ic_ab_back_mtrl_am_alpha));

        actionBar.setDisplayHomeAsUpEnabled(true);
    }
}

private Camera.Size getBestPreviewSize(int width, int height, Camera.Parameters parameters){
    Camera.Size bestSize = null;
    List<Camera.Size> sizeList = parameters.getSupportedPreviewSizes();

    bestSize = sizeList.get(0);

    for(int i = 1; i < sizeList.size(); i++){
        if((sizeList.get(i).width * sizeList.get(i).height) >
                (bestSize.width * bestSize.height)){
            bestSize = sizeList.get(i);
        }
    }
    return bestSize;
}
/**
 * Handles the requesting of the camera permission.  This includes
 * showing a "Snackbar" message of why the permission is needed then
 * sending the request.
 */
private void requestCameraPermission() {
    Log.w(TAG, "Camera permission is not granted. Requesting permission");

    final String[] permissions = new String[]{Manifest.permission.CAMERA};

    if (!ActivityCompat.shouldShowRequestPermissionRationale(this,
            Manifest.permission.CAMERA)) {
        ActivityCompat.requestPermissions(this, permissions, RC_HANDLE_CAMERA_PERM);
        return;
    }

    final Activity thisActivity = this;

    View.OnClickListener listener = new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            ActivityCompat.requestPermissions(thisActivity, permissions,
                    RC_HANDLE_CAMERA_PERM);
        }
    };

    Snackbar.make(mGraphicOverlay, R.string.permission_camera_rationale,
            Snackbar.LENGTH_INDEFINITE)
            .setAction(R.string.ok, listener)
            .show();
}

@Override
public boolean onTouchEvent(MotionEvent e) {
    boolean b = scaleGestureDetector.onTouchEvent(e);

    boolean c = gestureDetector.onTouchEvent(e);

    return b || c || super.onTouchEvent(e);
}

/**
 * Creates and starts the camera.  Note that this uses a higher resolution in comparison
 * to other detection examples to enable the barcode detector to detect small barcodes
 * at long distances.
 *
 * Suppressing InlinedApi since there is a check that the minimum version is met before using
 * the constant.
 */
@SuppressLint("InlinedApi")
private void createCameraSource(boolean autoFocus, boolean useFlash) {
    Context context = getApplicationContext();

    // A barcode detector is created to track barcodes.  An associated multi-processor instance
    // is set to receive the barcode detection results, track the barcodes, and maintain
    // graphics for each barcode on screen.  The factory is used by the multi-processor to
    // create a separate tracker instance for each barcode.

    BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context).setBarcodeFormats(Barcode.CODE_128 | Barcode.DATA_MATRIX | Barcode.QR_CODE).build();
    BarcodeTrackerFactory barcodeFactory = new BarcodeTrackerFactory(mGraphicOverlay);
    barcodeDetector.setProcessor(
            new MultiProcessor.Builder<>(barcodeFactory).build());

    if (!barcodeDetector.isOperational()) {
        // Note: The first time that an app using the barcode or face API is installed on a
        // device, GMS will download native libraries to the device in order to do detection.
        // Usually this completes before the app is run for the first time.  But if that
        // download has not yet completed, then the above call will not detect any barcodes
        // and/or faces.
        //
        // isOperational() can be used to check if the required native libraries are currently
        // available.  The detectors will automatically become operational once the library
        // downloads complete on device.
        Log.w(TAG, "Detector dependencies are not yet available.");

        // Check for low storage.  If there is low storage, the native library will not be
        // downloaded, so detection will not become operational.
        IntentFilter lowstorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
        boolean hasLowStorage = registerReceiver(null, lowstorageFilter) != null;

        if (hasLowStorage) {
            Toast.makeText(this, R.string.low_storage_error, Toast.LENGTH_LONG).show();
            Log.w(TAG, getString(R.string.low_storage_error));
        }
    }

    // Creates and starts the camera.  Note that this uses a higher resolution in comparison
    // to other detection examples to enable the barcode detector to detect small barcodes
    // at long distances.
    CameraSource.Builder builder = new CameraSource.Builder(getApplicationContext(), barcodeDetector)
            .setFacing(CameraSource.CAMERA_FACING_BACK)
            .setRequestedPreviewSize(1100, 844)
            .setRequestedFps(15.0f);
    // make sure that auto focus is an available option
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
        builder = builder.setFocusMode(
                autoFocus ? Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE : null);
    }

    mCameraSource = builder
            .setFlashMode(useFlash ? Camera.Parameters.FLASH_MODE_TORCH : null)
            .build();
}


/**
 * Restarts the camera.
 */
@Override
protected void onResume() {
    super.onResume();
    startCameraSource();
}

/**
 * Stops the camera.
 */
@Override
protected void onPause() {
    super.onPause();
    if (mPreview != null) {
        mPreview.stop();
    }
}

/**
 * Releases the resources associated with the camera source, the associated detectors, and the
 * rest of the processing pipeline.
 */
@Override
protected void onDestroy() {
    super.onDestroy();
    if (mPreview != null) {
        mPreview.release();
    }
}


@Override
public void onRequestPermissionsResult(int requestCode,
                                       @NonNull String[] permissions,
                                       @NonNull int[] grantResults) {
    if (requestCode != RC_HANDLE_CAMERA_PERM) {
        Log.d(TAG, "Got unexpected permission result: " + requestCode);
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        return;
    }

    if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        Log.d(TAG, "Camera permission granted - initialize the camera source");
        // we have permission, so create the camerasource
        boolean autoFocus = getIntent().getBooleanExtra(AutoFocus,false);
        boolean useFlash = getIntent().getBooleanExtra(UseFlash, false);
        createCameraSource(autoFocus, useFlash);
        return;
    }

    Log.e(TAG, "Permission not granted: results len = " + grantResults.length +
            " Result code = " + (grantResults.length > 0 ? grantResults[0] : "(empty)"));

    DialogInterface.OnClickListener listener = new DialogInterface.OnClickListener() {
        public void onClick(DialogInterface dialog, int id) {
            finish();
        }
    };

    AlertDialog.Builder builder = new AlertDialog.Builder(this);
    builder.setTitle("Multitracker sample")
            .setMessage(R.string.no_camera_permission)
            .setPositiveButton(R.string.ok, listener)
            .show();
}

/**
 * Starts or restarts the camera source, if it exists.  If the camera source doesn't exist yet
 * (e.g., because onResume was called before the camera source was created), this will be called
 * again when the camera source is created.
 */
private void startCameraSource() throws SecurityException {
    // check that the device has play services available.
    int code = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(
            getApplicationContext());
    if (code != ConnectionResult.SUCCESS) {
        Dialog dlg =
                GoogleApiAvailability.getInstance().getErrorDialog(this, code, RC_HANDLE_GMS);
        dlg.show();
    }

    if (mCameraSource != null) {
        try {
            mPreview.start(mCameraSource, mGraphicOverlay);
        } catch (IOException e) {
            Log.e(TAG, "Unable to start camera source.", e);
            mCameraSource.release();
            mCameraSource = null;
        }
    }
}

/**
 * onTap is called to capture the oldest barcode currently detected and
 * return it to the caller.
 *
 * @param rawX - the raw position of the tap
 * @param rawY - the raw position of the tap.
 * @return true if the activity is ending.
 */

private boolean onTap(float rawX, float rawY) {
    //TODO: use the tap position to select the barcode.
    BarcodeGraphic graphic = mGraphicOverlay.getFirstGraphic();
    Barcode barcode = null;
    if (graphic != null) {
        barcode = graphic.getBarcode();
        if (barcode != null) {
            Intent data = new Intent();
            data.putExtra(BarcodeObject, barcode);
            setResult(CommonStatusCodes.SUCCESS, data);
            finish();
        }
        else {
            Log.d(TAG, "barcode data is null");
        }
    }
    else {
        Log.d(TAG,"no barcode detected");
    }
    return barcode != null;
}

private class CaptureGestureListener extends GestureDetector.SimpleOnGestureListener {

    @Override
    public boolean onSingleTapConfirmed(MotionEvent e) {

        return onTap(e.getRawX(), e.getRawY()) || super.onSingleTapConfirmed(e);
    }
}

private class ScaleListener implements ScaleGestureDetector.OnScaleGestureListener {

    /**
     * Responds to scaling events for a gesture in progress.
     * Reported by pointer motion.
     *
     * @param detector The detector reporting the event - use this to
     *                 retrieve extended info about event state.
     * @return Whether or not the detector should consider this event
     * as handled. If an event was not handled, the detector
     * will continue to accumulate movement until an event is
     * handled. This can be useful if an application, for example,
     * only wants to update scaling factors if the change is
     * greater than 0.01.
     */
    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        return false;
    }

    /**
     * Responds to the beginning of a scaling gesture. Reported by
     * new pointers going down.
     *
     * @param detector The detector reporting the event - use this to
     *                 retrieve extended info about event state.
     * @return Whether or not the detector should continue recognizing
     * this gesture. For example, if a gesture is beginning
     * with a focal point outside of a region where it makes
     * sense, onScaleBegin() may return false to ignore the
     * rest of the gesture.
     */
    @Override
    public boolean onScaleBegin(ScaleGestureDetector detector) {
        return true;
    }

    /**
     * Responds to the end of a scale gesture. Reported by existing
     * pointers going up.
     * <p/>
     * Once a scale has ended, {@link ScaleGestureDetector#getFocusX()}
     * and {@link ScaleGestureDetector#getFocusY()} will return focal point
     * of the pointers remaining on the screen.
     *
     * @param detector The detector reporting the event - use this to
     *                 retrieve extended info about event state.
     */
    @Override
    public void onScaleEnd(ScaleGestureDetector detector) {
        mCameraSource.doZoom(detector.getScaleFactor());
    }
}

@Override
public boolean onOptionsItemSelected(MenuItem item) {
    int id = item.getItemId();

    if (id == android.R.id.home) {
        onBackPressed();
        return true;
    }
    return super.onOptionsItemSelected(item);
}
}
1 Answer


When scanning a GS1-128 symbol, the FNC1 in the first position acts as a flag character to indicate the presence of GS1 Application Identifier standard format data and is intentionally omitted from the scanned data, whilst any inter-field FNC1 formatting character is transmitted as GS (ASCII 29).
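
For example (an illustrative element string, not taken from the question), a GS1-128 symbol encoding (01)09501101020917 (10)AB123 (21)12345 would therefore be transmitted without the leading FNC1 and with the inter-field FNC1 as GS:

    // Illustrative only: the string a decoder hands back for the element string
    // (01)09501101020917 (10)AB123 (21)12345. The leading FNC1 vanishes and the
    // FNC1 terminating the variable-length AI 10 arrives as GS (ASCII 29).
    String expected = "0109501101020917" + "10AB123" + (char) 29 + "2112345";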

The implicit leading FNC1 can be inferred if your reader is configured to emit symbology identifiers at the start of the scanned data. In this case your GS1-128 scanned data will begin with ]C1, rather than the ]C0 used for generic Code 128.
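
Purely as a sketch, assuming a reader that does emit symbology identifiers (neither library discussed here exposes them, so the helper below is hypothetical), the check reduces to a prefix test:

    // Hypothetical check, only meaningful when the scanner prepends an
    // ISO/IEC 15424 symbology identifier to the decoded data.
    static boolean isGs1_128(String scannedData) {
        // "]C1" = Code 128 with FNC1 in first position (GS1-128);
        // "]C0" = plain Code 128 without a leading FNC1.
        return scannedData.startsWith("]C1");
    }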

Unfortunately it does not appear that either the ZBar library or the Google Vision library can be configured to return symbology identifiers, which is a disappointing limitation. Additionally, the Google Vision library erroneously returns a leading GS representing the FNC1 in first position.

Reading of GS1 formatted data is described in detail by this answer.
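
As an illustration of what that answer describes, here is a minimal sketch of walking a GS1 element string once the leading FNC1/GS has been removed. Only two AIs are included and every name is illustrative; a complete parser needs the full GS1 AI table:

    import java.util.HashMap;
    import java.util.LinkedHashMap;
    import java.util.Map;

    public final class Gs1Parser {

        private static final char GS = 29; // ASCII group separator (inter-field FNC1)

        // Tiny subset of the GS1 AI table, for illustration only.
        private static final Map<String, Integer> FIXED_LENGTH = new HashMap<>();
        static {
            FIXED_LENGTH.put("01", 14); // GTIN
            FIXED_LENGTH.put("17", 6);  // expiry date, YYMMDD
        }

        /** Splits element-string data (leading FNC1 already stripped) into AI/value pairs. */
        public static Map<String, String> parse(String data) {
            Map<String, String> result = new LinkedHashMap<>();
            int i = 0;
            while (i + 2 <= data.length()) {
                String ai = data.substring(i, i + 2);
                i += 2;
                Integer len = FIXED_LENGTH.get(ai);
                int end;
                if (len != null) {
                    end = Math.min(i + len, data.length());   // fixed-length field
                } else {
                    end = data.indexOf(GS, i);                // variable length: read up to GS
                    if (end < 0) end = data.length();         // or to end of data
                }
                result.put(ai, data.substring(i, end));
                i = end;
                if (i < data.length() && data.charAt(i) == GS) i++; // skip the separator
            }
            return result;
        }
    }

Applied to the example string above, this yields 01 -> 09501101020917, 10 -> AB123 and 21 -> 12345.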

Specifically, the ISO/IEC 15417 (Code 128) bar code symbology specification says:

"Any application which utilizes Code 128 symbols with FNC1 in the first or second data position should require the transmission of symbology identifiers to be enabled. When FNC1 is used in the first or second position it shall not be represented in the transmitted message, although its presence is indicated by the use of modifier values 1 or 2 respectively in the symbology identifier."

Community
  • 1
  • 1
Terry Burton
  • 2,801
  • 1
  • 29
  • 41
  • You are assuming we know more about your application than you are presenting here. You need to provide far more relevant detail if you expect an "instant" and useful workaround. How is it that your workflow needs to see an FNC1 character that by definition should be invisible? How does your ZBar input for GS1-128 essentially differ from your Google Vision library input for GS1 DataMatrix? – Terry Burton Aug 26 '16 at 08:45
  • For scanning GS1 DataMatrix I am using Google's Vision library, which provides a GS at the first position, but ZBar does not provide it when scanning a GS1-128 barcode. – Akash Dubey Sep 16 '16 at 13:57
  • From your description it would appear that the Google Vision library is incorrect, as it should not in fact be providing an initial GS representing the FNC1 in first position. However, to be of any use for your purposes each library should provide a means to identify the symbol sub-type (GS1 DataMatrix vs DataMatrix, etc.) from which the data was read, either by providing a standards-compliant symbology identifier or by providing some form of "GS1 structured data detected" hint. A quick look at the API reference for each project does not indicate the presence of such features. – Terry Burton Sep 16 '16 at 21:19
  • Can you suggest any open-source library that provides an FNC1 character at the first position for a GS1-128 barcode? – Akash Dubey Sep 19 '16 at 12:21
  • You are unreasonably asking for a library that has a bug compatible with your incorrect expectations, i.e. one that converts a leading FNC1 to GS contrary to the specification. Even if you find such a library, your solution will be fragile since the bug may eventually be fixed, breaking future builds of your app. – Terry Burton Sep 19 '16 at 14:20
  • Instead (in the absence of each library indicating the presence of GS1 data) you should work around the Google Vision bug by stripping the leading GS character from the scanned data. This will give you parity between the two libraries. You should then use a heuristic or explicit parser to determine whether the scanned data is in GS1 format, rejecting the scan if it isn't valid structured data. If your subsequent data processing requires a leading GS character then you can reintroduce it after this initial validation phase; a sketch of this approach follows the comment thread below. – Terry Burton Sep 19 '16 at 14:25
  • If you choose to parse the input to determine whether it is valid GS1 data then the following will help you to write this code. http://stackoverflow.com/a/31760872/2568535 – Terry Burton Sep 19 '16 at 14:29
  • Here it is necessary to get the FNC1 at the first position to prove that the barcode is of GS1-128 symbology. That's why I need a library that tells me whether the FNC1 is encoded in the barcode, and if it is encoded then it should appear in the decoded value, whether as ]C1, as ASCII 29, or as anything else that indicates the presence of FNC1 at the first position. – Akash Dubey Sep 20 '16 at 06:53
  • Correct, and I'm saying that you are out of luck in this regard so you will have to resort to a heuristic. Or extend/improve the libraries yourself... – Terry Burton Sep 20 '16 at 11:36
  • Can you suggest something regarding that? I don't even know where to start! – Akash Dubey Sep 21 '16 at 05:18
  • You haven't demonstrated an attempt to understand or solve the problem yourself given what you've learned here. Nobody is likely to write your code for you, so you might do better to find someone with this ability and pay them for their expertise. From a non-programming perspective you could always raise bugs against these projects, pointing them to this discussion for context. 1: GV should not transmit a leading FNC1. 2/3: the GV/ZBar APIs should return a symbology identifier that indicates GS1 formatted data (FNC1 in first). – Terry Burton Sep 21 '16 at 12:27
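
One possible shape for the workaround described in the comments above, under the assumption that the Google Vision result may begin with a spurious GS while ZBar never reports one. None of these names belong to either library's API:

    // Normalise a scan so both libraries agree, validate it, and optionally
    // re-introduce a leading GS for downstream processing.
    static String normaliseAndValidate(String rawValue) {
        final char GS = 29;
        // Strip the erroneous leading GS that Google Vision reports for GS1 DataMatrix.
        String data = (!rawValue.isEmpty() && rawValue.charAt(0) == GS)
                ? rawValue.substring(1)
                : rawValue;
        if (!looksLikeGs1(data)) {
            return null; // reject: not valid GS1 structured data
        }
        // Re-introduce the leading GS only if later processing expects it.
        return GS + data;
    }

    // Placeholder heuristic: accept data whose first two characters are digits
    // (a plausible AI prefix). A proper check would parse the whole element
    // string against the GS1 AI table, e.g. with the parser sketch above.
    static boolean looksLikeGs1(String data) {
        return data.length() >= 2
                && Character.isDigit(data.charAt(0))
                && Character.isDigit(data.charAt(1));
    }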