Human Generated Data

Title

Firdawsi Presents His Work to Mahmud

Date

1935

People

Artist: Bihzad, Persian 20th century

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of John Goelet, 1958.244

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 99.1
Human 99.1
Person 98.3
Person 97.2
Building 91.2
Architecture 91.2
Furniture 89.5
Person 88
Art 80.7
Painting 77.2
Drawing 68.8
Worship 58.8
Temple 58.8
Shrine 58.8
Pillar 55.6
Column 55.6
Throne 55.2

Clarifai
created on 2020-04-24

people 100
adult 99.3
group 98.3
woman 97.9
two 97.4
man 97.4
furniture 96.9
home 96.6
seat 96.5
print 96
sit 95.3
leader 95.1
room 93.6
illustration 93.2
child 92.7
easy chair 92.4
engraving 91.8
art 90.8
one 90.7
three 90.2

Imagga
created on 2020-04-24

mosaic 80.6
transducer 44.7
art 36.9
electrical device 33.5
old 27.9
device 24.9
grunge 23.8
ancient 22.5
texture 21.5
vintage 21.5
architecture 19
design 18.6
antique 18.5
retro 17.2
city 15.8
tile 15.1
pattern 15
wall 14.9
creation 14.6
culture 14.5
aged 14.5
decoration 14.5
frame 14.1
stone 14
building 14
temple 13.8
religion 13.4
door 12.9
carving 12.8
history 12.5
travel 12
dirty 11.7
instrumentality 11.2
entrance 10.6
paper 10.2
paint 10
gold 9.9
sculpture 9.7
ornamental 9.5
structure 9.5
religious 9.4
style 8.9
color 8.9
brown 8.8
artistic 8.7
skyline 8.5
brick 8.5
wallpaper 8.4
wood 8.3
window 8.3
letter 8.2
artwork 8.2
rough 8.2
graphic 8
home 8
decor 8
doorway 7.9
gate 7.8
messy 7.7
house 7.6
traditional 7.5
historic 7.3
throne 7.3
new 7.3
painting 7.2
black 7.2
material 7.1
textured 7

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 94.6
black and white 76.6
cartoon 74.5
person 61.9
drawing 53.2
old 46.7
altar 14.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 19-31
Gender Male, 54.4%
Happy 45.7%
Surprised 48.5%
Sad 45%
Calm 45.1%
Fear 49.9%
Angry 45.5%
Disgusted 45.1%
Confused 45.2%

AWS Rekognition

Age 13-25
Gender Male, 53%
Disgusted 45.9%
Fear 45.3%
Surprised 50.8%
Angry 45.2%
Happy 46%
Calm 46.5%
Confused 45.2%
Sad 45.1%

AWS Rekognition

Age 9-19
Gender Male, 53.9%
Happy 51.9%
Disgusted 45.4%
Confused 45.1%
Surprised 45.3%
Calm 46%
Fear 45.2%
Angry 45.9%
Sad 45.2%

AWS Rekognition

Age 17-29
Gender Female, 50.3%
Calm 54.2%
Happy 45%
Angry 45.2%
Surprised 45.1%
Fear 45%
Sad 45.1%
Confused 45.1%
Disgusted 45.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Painting 77.2%

Categories

Text analysis

Google

生144手
144