24 changes: 24 additions & 0 deletions README.md
@@ -93,6 +93,15 @@ You get 1,000 randomly generated images with random text on them like:

By default, they will be generated to `out/` in the current working directory.


### Testing (using Python)

The command used to prepare the test images (ground truth):

`python3 run.py -c 1 -w 11 -i texts/hu_test.txt --name_format 0 --output_dir "out3/" -f 64 --thread_count 8 --font_dir fonts/hu_test/`

To run the unit tests:

```
cd TextRecognitionDataGeneratorHuMu23
python3 tests.py
```
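
For reference, a minimal image-comparison check could be built on `diffimg` (already pinned in `requirements-hw.txt`). The sketch below is illustrative only and is not the contents of `tests.py`; the directory names and tolerance are assumptions.

```python
# Illustrative sketch only -- not the project's tests.py.
# Assumes reference images in out3/ and freshly generated images in out/.
import os
import unittest

from diffimg import diff  # returns the fraction of pixels that differ (0.0-1.0)


class TestGeneratedImages(unittest.TestCase):
    REFERENCE_DIR = "out3/"  # assumed location of the ground-truth images
    GENERATED_DIR = "out/"   # assumed location of the newly generated images
    MAX_DIFF = 0.01          # arbitrary tolerance for this sketch

    def test_images_match_ground_truth(self):
        for name in os.listdir(self.REFERENCE_DIR):
            generated = os.path.join(self.GENERATED_DIR, name)
            reference = os.path.join(self.REFERENCE_DIR, name)
            self.assertTrue(os.path.exists(generated), f"missing {generated}")
            self.assertLessEqual(diff(reference, generated), self.MAX_DIFF)


if __name__ == "__main__":
    unittest.main()
```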
### Text skewing

What if you want random skewing? Add `-k` and `-rk` (`trdg -c 1000 -w 5 -f 64 -k 5 -rk`)
@@ -150,6 +159,20 @@ The text is chosen at random in a dictionary file (that can be found in the *dic

There are a lot of parameters that you can tune to get the results you want, therefore I recommend checking out `trdg -h` for more information.

## Create images with Hungarian text

It is simple! Just do `trdg -l hu -c 1000 -w 5`!

Here are two sample outputs with Hungarian text:

![30](samples/30.jpg "0")

![31](samples/31.jpg "1")
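
If you prefer generating Hungarian images from Python rather than the CLI, the library's generator classes take the same options as the command line. The sketch below is a minimal example assuming the `hu` dictionary and `fonts/hu` fonts from this fork are installed; the output loop and paths are illustrative.

```python
# Minimal sketch: generate Hungarian samples from Python.
# Assumes the `hu` dictionary and fonts/hu fonts of this fork are in place.
import os

from trdg.generators import GeneratorFromDict

os.makedirs("out", exist_ok=True)

generator = GeneratorFromDict(
    language="hu",  # pick words from the Hungarian dictionary and fonts/hu
    count=1000,     # same as -c 1000
    length=5,       # same as -w 5
    size=64,        # same as -f 64
)

for i, (img, lbl) in enumerate(generator):
    # img is a Pillow image, lbl is the ground-truth string
    img.save(f"out/{i}.jpg")
```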

## Create images with Chinese text

It is simple! Just do `trdg -l cn -c 1000 -w 5`!
@@ -184,6 +207,7 @@ The script picks a font at random from the *fonts* directory.
| fonts/ko | Korean |
| fonts/ja | Japanese |
| fonts/th | Thai |
| fonts/hu | Hungarian |

Simply add/remove fonts until you get the desired output.

33 changes: 33 additions & 0 deletions fonts_testing.py
@@ -0,0 +1,33 @@
from fontTools.ttLib import TTFont
import os


def has_glyph(font, glyph):
    """Return True if any cmap table of the font maps the given character."""
    return any(ord(glyph) in table.cmap.keys() for table in font['cmap'].tables)


# Hungarian lowercase and uppercase characters to check.
list_of_s_char = ['a', 'á', 'b', 'c', 's', 'd', 'z', 'e', 'é', 'f', 'g', 'g', 'y', 'h', 'i', 'í', 'j', 'k', 'l', 'm', 'n', 'o', 'ó', 'ö', 'ő', 'p', 'q', 'r', 't', 'u', 'ú', 'ü', 'ű', 'v', 'w', 'x', 'y', 'z']
list_of_c_char = ['Ä', 'Á', 'B', 'C', 'D', 'E', 'É', 'F', 'G', 'H', 'I', 'Í', 'J', 'K', 'L', 'M', 'N', 'O', 'Ó', 'Ö', 'Ő', 'P', 'Q', 'R', 'S', 'T', 'U', 'Ú', 'Ü', 'Ű', 'V', 'W', 'X', 'Y', 'Z']

# Directory containing the Hungarian fonts to check (adjust to your checkout).
path = '/home/ngyongyossy/mohammad/trdghm/TextRecognitionDataGeneratorHuMu23/trdg/fonts/hu/'
fonts_list = os.listdir(path)
print(f'We are testing: {len(fonts_list)} fonts', fonts_list)


# 1 - Test lowercase letters
print('1 - Test lowercase letters\n')
for font_name in fonts_list:
    print('We are testing font ::', font_name)
    font = TTFont(path + font_name)

    for char in list_of_s_char:
        print(char, has_glyph(font, char))


# 2 - Test uppercase letters
print('2 - Test uppercase letters')
for font_name in fonts_list:
    print('We are testing font ::', font_name)
    font = TTFont(path + font_name)

    for char in list_of_c_char:
        print(char, has_glyph(font, char))
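
The script above only prints per-character coverage. If you also want to narrow a font directory down to the fonts that cover a full character set, a standalone helper could look like the sketch below; the helper name, directory, and character string in the usage line are assumptions, not part of this PR.

```python
# Illustrative, standalone helper (an assumption, not part of fonts_testing.py):
# list the fonts in a directory that contain every character in `chars`.
import os

from fontTools.ttLib import TTFont


def has_glyph(font, glyph):
    # True if any cmap table of the font maps the given character.
    return any(ord(glyph) in table.cmap.keys() for table in font["cmap"].tables)


def fonts_with_full_coverage(font_dir, chars):
    covered = []
    for font_name in os.listdir(font_dir):
        font = TTFont(os.path.join(font_dir, font_name))
        if all(has_glyph(font, c) for c in chars):
            covered.append(font_name)
    return covered


# Example usage (assumed path and character set):
# print(fonts_with_full_coverage("trdg/fonts/hu/", "áéíóöőúüűÁÉÍÓÖŐÚÜŰ"))
```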
2 changes: 1 addition & 1 deletion requirements-hw.txt
@@ -5,6 +5,6 @@ opencv-python>=4.2.0.32
tqdm>=4.23.0
beautifulsoup4>=4.6.0
diffimg==0.2.3
tensorflow>=1.13.1,<1.14
matplotlib>=3.0.2
seaborn>=0.9.0
tensorflow>=1.13.1,<1.14
Binary file added samples/30.jpg
Binary file added samples/31.jpg
Binary file added samples/32.jpg
Binary file added samples/33.jpg