On what scale is the text_score parameter in the output of readtext()? #517

Open
jkcg-learning opened this issue Oct 8, 2021 · 2 comments
Labels: bug (Something isn't working)

Comments

@jkcg-learning

What is the range of the text_score parameter in the output of readtext()?

For some inferences, I am seeing a text confidence score above 1.00 (100%).

@gaotongxiao (Collaborator)

It depends on the model you've been running. We didn't implement a universal range limit on scores, so for now the score is only a relative confidence indicator within the same model.
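Since the scores are only comparable within one model, a practical workaround on the caller's side is to clip text_score into [0, 1] before applying any threshold. This is a hypothetical helper sketched for illustration, not part of MMOCR's API; it assumes the dict shape that readtext(details=True) produces (shown in the output later in this thread):

```python
def clip_scores(ocr_output, key="text_score"):
    """Clip per-detection scores into [0, 1] in place.

    Assumes the shape returned by MMOCR().readtext(..., details=True):
    {'filename': ..., 'result': [{'box': ..., 'box_score': ...,
                                  'text': ..., 'text_score': ...}, ...]}
    """
    for det in ocr_output["result"]:
        det[key] = max(0.0, min(1.0, det[key]))
    return ocr_output

# Example with values taken from the output in this thread
out = {"result": [{"text": "ocbc", "text_score": 1.75},
                  {"text": "sale", "text_score": 0.96}]}
clip_scores(out)  # 1.75 becomes 1.0; 0.96 is unchanged
```

Note this only makes thresholds behave predictably; it does not turn the scores into calibrated probabilities.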

@jkcg-learning (Author) commented Oct 11, 2021

I used the following code for inference on the "demo_text_ocr.jpg" file.
It loads PANet (detection) and SegOCR (recognition).

from mmocr.utils.ocr import MMOCR

ocr = MMOCR()  # default detection and recognition models
results = ocr.readtext('demo/demo_text_ocr.jpg', print_result=True, details=True, imshow=False)

[Attached image: demo_text_ocr.jpg]

Output

{'filename': 'demo_text_ocr', 'result': [{'box': [218, 44, 157, 39, 160, 12, 220, 16], 'box_score': 0.9534332156181335, 'text': 'sale', 'text_score': 1.0}, {'box': [66, 60, 33, 60, 33, 43, 66, 43], 'box_score': 0.9697706699371338, 'text': 'sale', 'text_score': 0.9623287671232876}, {'box': [582, 117, 457, 109, 461, 55, 586, 63], 'box_score': 0.9692338109016418, 'text': 'all', 'text_score': 1.3333333333333333}, {'box': [706, 133, 580, 117, 587, 62, 713, 78], 'box_score': 0.9724510908126831, 'text': 'lyear', 'text_score': 1.0}, {'box': [83, 120, 21, 114, 23, 88, 85, 94], 'box_score': 0.975725531578064, 'text': 'sale', 'text_score': 1.0}, {'box': [663, 247, 447, 228, 452, 170, 669, 189], 'box_score': 0.9692399501800537, 'text': 'round', 'text_score': 1.0}, {'box': [514, 301, 462, 294, 464, 277, 516, 283], 'box_score': 0.95651775598526, 'text': 'aretal', 'text_score': 1.0}, {'box': [584, 309, 516, 302, 518, 283, 585, 290], 'box_score': 0.9467790722846985, 'text': 'loran0sg', 'text_score': 0.9693693693693693}, {'box': [584, 308, 584, 288, 637, 288, 637, 308], 'box_score': 0.9175753593444824, 'text': 'oveano', 'text_score': 1.1452991452991454}, {'box': [637, 308, 637, 292, 673, 292, 673, 308], 'box_score': 0.9137787818908691, 'text': 'ostcms', 'text_score': 1.0}, {'box': [496, 320, 449, 312, 452, 294, 500, 302], 'box_score': 0.9457231163978577, 'text': 'soceci', 'text_score': 1.0}, {'box': [587, 331, 499, 323, 501, 303, 588, 311], 'box_score': 0.9494416117668152, 'text': 'cnounsons', 'text_score': 0.9797979797979798}, {'box': [660, 328, 590, 328, 590, 312, 660, 312], 'box_score': 0.951109766960144, 'text': 'scronma', 'text_score': 0.9240506329113923}, {'box': [557, 415, 473, 404, 477, 373, 561, 383], 'box_score': 0.9712486267089844, 'text': 'ocbc', 'text_score': 1.75}, {'box': [560, 414, 560, 385, 620, 385, 620, 414], 'box_score': 0.9603750705718994, 'text': 'bank', 'text_score': 1.25}]}
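A quick sanity check (plain Python over the printed dict, not an MMOCR utility) confirms the asymmetry: in this run box_score stays within [0, 1], while text_score does not:

```python
# A few entries copied verbatim from the output above
result = [
    {"box_score": 0.9534332156181335, "text": "sale", "text_score": 1.0},
    {"box_score": 0.9692338109016418, "text": "all", "text_score": 1.3333333333333333},
    {"box_score": 0.9712486267089844, "text": "ocbc", "text_score": 1.75},
    {"box_score": 0.9697706699371338, "text": "sale", "text_score": 0.9623287671232876},
]

box_scores = [d["box_score"] for d in result]
text_scores = [d["text_score"] for d in result]

print(max(box_scores))   # stays below 1.0
print(max(text_scores))  # exceeds 1.0 (1.75 in this sample)
```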

Is box_score on a scale of 0.00 to 1.00, or 0% to 100%?

The text_score scale is not clear. In some cases it exceeds 1.00 (100%).

What should we conclude about the confidence scores for box and text?

@gaotongxiao added the bug (Something isn't working) label on Nov 12, 2021