# Integrate BERT question answerer

The Task Library `BertQuestionAnswerer` API loads a BERT model and answers
questions based on the content of a given passage. For more information, see the
documentation for the Question-Answer model
<a href="../../models/bert_qa/overview.md">here</a>.

## Key features of the BertQuestionAnswerer API

*   Takes a question and a context passage as text inputs and outputs a list of
    possible answers.

*   Performs out-of-graph WordPiece or SentencePiece tokenization on the input
    text.
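
The WordPiece step above can be illustrated with a minimal greedy
longest-match-first tokenizer. This is a sketch with a toy vocabulary, not the
Task Library's actual implementation:

```python
# Minimal greedy longest-match-first WordPiece tokenizer (illustrative only).
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        cur = None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation pieces are prefixed
            if piece in vocab:
                cur = piece
                break
            end -= 1
        if cur is None:
            return [unk]  # no vocabulary piece matches: emit the unknown token
        tokens.append(cur)
        start = end
    return tokens

vocab = {"rain", "##forest", "the", "amazon"}
print(wordpiece_tokenize("rainforest", vocab))  # ['rain', '##forest']
```

Because tokenization happens out of graph, the model itself only ever sees the
resulting token ids, not raw text.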

## Supported BertQuestionAnswerer models

The following models are compatible with the `BertQuestionAnswerer` API.

*   Models created by
    [TensorFlow Lite Model Maker for BERT Question Answer](https://www.tensorflow.org/lite/tutorials/model_maker_question_answer).

*   The
    [pretrained BERT models on TensorFlow Hub](https://tfhub.dev/tensorflow/collections/lite/task-library/bert-question-answerer/1).

*   Custom models that meet the
    [model compatibility requirements](#model-compatibility-requirements).
## Run inference in Java

### Step 1: Import Gradle dependency and other settings

Copy the `.tflite` model file to the assets directory of the Android module
where the model will be run. Specify that the file should not be compressed, and
add the TensorFlow Lite library to the module’s `build.gradle` file:

```java
android {
    // Other settings

    // Specify that the tflite file should not be compressed for the app apk
    aaptOptions {
        noCompress "tflite"
    }
}

dependencies {
    // Other dependencies

    // Import the Task Text Library dependency
    implementation 'org.tensorflow:tensorflow-lite-task-text:0.1.0'
}
```

### Step 2: Run inference using the API

```java
// Initialization
BertQuestionAnswerer answerer = BertQuestionAnswerer.createFromFile(androidContext, modelFile);

// Run inference
List<QaAnswer> answers = answerer.answer(contextOfTheQuestion, questionToAsk);
```

See the
[source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/java/src/java/org/tensorflow/lite/task/text/qa/BertQuestionAnswerer.java)
for more details.

## Run inference in Swift

### Step 1: Import CocoaPods

Add the TensorFlowLiteTaskText pod to your Podfile:

```
target 'MySwiftAppWithTaskAPI' do
  use_frameworks!
  pod 'TensorFlowLiteTaskText', '~> 0.0.1-nightly'
end
```

### Step 2: Run inference using the API

```swift
// Initialization
let mobileBertAnswerer = TFLBertQuestionAnswerer.questionAnswerer(
      modelPath: mobileBertModelPath)

// Run inference
let answers = mobileBertAnswerer.answer(
      context: context, question: question)
```

See the
[source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/ios/task/text/qa/Sources/TFLBertQuestionAnswerer.h)
for more details.

## Run inference in C++

Note: We are working on improving the usability of the C++ Task Library, such as
providing prebuilt binaries and creating user-friendly workflows to build from
source code. The C++ API may be subject to change.

```c++
// Initialization
std::unique_ptr<BertQuestionAnswerer> answerer = BertQuestionAnswerer::CreateFromFile(model_file).value();

// Run inference
std::vector<QaAnswer> answers = answerer->Answer(context_of_question, question_to_ask);
```

See the
[source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/cc/task/text/qa/bert_question_answerer.h)
for more details.

## Example results

Here is an example of the answer results of the
[ALBERT model](https://tfhub.dev/tensorflow/lite-model/albert_lite_base/squadv1/1).

Context: "The Amazon rainforest, alternatively, the Amazon Jungle, also known in
English as Amazonia, is a moist broadleaf tropical rainforest in the Amazon
biome that covers most of the Amazon basin of South America. This basin
encompasses 7,000,000 km2 (2,700,000 sq mi), of which
5,500,000 km2 (2,100,000 sq mi) are covered by the rainforest. This region
includes territory belonging to nine nations."

Question: "Where is Amazon rainforest?"

Answers:

```
answer[0]:  'South America.'
logit: 1.84847, start_index: 39, end_index: 40
answer[1]:  'most of the Amazon basin of South America.'
logit: 1.2921, start_index: 34, end_index: 40
answer[2]:  'the Amazon basin of South America.'
logit: -0.0959535, start_index: 36, end_index: 40
answer[3]:  'the Amazon biome that covers most of the Amazon basin of South America.'
logit: -0.498558, start_index: 28, end_index: 40
answer[4]:  'Amazon basin of South America.'
logit: -0.774266, start_index: 37, end_index: 40
```
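
BERT-style question answerers typically rank candidate spans by the sum of the
start and end logits produced by the model. This sketch shows that ranking
logic with toy logits (it is not the Task Library's implementation, and the
numbers are made up):

```python
import itertools

def best_spans(start_logits, end_logits, top_k=3, max_len=8):
    # Score every valid span (start <= end, bounded length) by
    # start_logit + end_logit, then keep the top_k highest-scoring spans.
    spans = [
        (s, e, start_logits[s] + end_logits[e])
        for s, e in itertools.product(range(len(start_logits)), range(len(end_logits)))
        if s <= e < s + max_len
    ]
    return sorted(spans, key=lambda x: x[2], reverse=True)[:top_k]

start = [0.1, 2.0, 0.3]
end = [0.2, 0.1, 1.5]
print(best_spans(start, end))  # highest-scoring span is (1, 2, 3.5)
```

The `start_index` and `end_index` fields in the results above identify the
winning spans by token position in the tokenized context.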

Try out the simple
[CLI demo tool for BertQuestionAnswerer](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/examples/task/text/desktop/README.md#bert-question-answerer)
with your own model and test data.

## Model compatibility requirements

The `BertQuestionAnswerer` API expects a TFLite model with mandatory
[TFLite Model Metadata](../../convert/metadata.md).

The Metadata should meet the following requirements:

*   `input_process_units` for the WordPiece/SentencePiece tokenizer

*   3 input tensors with names "ids", "mask" and "segment_ids" for the output of
    the tokenizer

*   2 output tensors with names "end_logits" and "start_logits" to indicate the
    answer's relative position in the context
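
The three input tensors are conventionally packed from the tokenized question
and context as `[CLS] question [SEP] context [SEP]`, padded to the model's
sequence length. This sketch uses toy token ids and assumed special-token ids
(101/102, as in standard BERT vocabularies); the Task Library performs this
packing internally:

```python
def pack_inputs(question_ids, context_ids, max_seq_len=16,
                cls_id=101, sep_id=102, pad_id=0):
    # ids: [CLS] question [SEP] context [SEP], then pad to max_seq_len.
    ids = [cls_id] + question_ids + [sep_id] + context_ids + [sep_id]
    # segment_ids: 0 for the question segment, 1 for the context segment.
    segment_ids = [0] * (len(question_ids) + 2) + [1] * (len(context_ids) + 1)
    # mask: 1 for real tokens, 0 for padding.
    mask = [1] * len(ids)
    pad = max_seq_len - len(ids)
    return ids + [pad_id] * pad, mask + [0] * pad, segment_ids + [0] * pad

ids, mask, segment_ids = pack_inputs([7, 8], [21, 22, 23])
print(ids[:8])  # [101, 7, 8, 102, 21, 22, 23, 102]
```

The model's "start_logits" and "end_logits" outputs are then scored over these
padded token positions to locate the answer span in the context segment.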
165