Human Generated Data

Title

FROG

Date

People
Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Helene K. Suermondt, 1973.14

Machine Generated Data

Tags

Amazon
created on 2022-06-18

Figurine 98.9
Human 94.3
Person 93.5
Toy 87.8
Person 82.2
Kneeling 72.2
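
These tag/score pairs are confidence percentages from Amazon Rekognition's label detection. A minimal sketch of reproducing the call with boto3 follows; the image file name and the MinConfidence threshold are assumptions, not part of this record:

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the object image.
    with open("frog.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=70,  # assumed cutoff; the tags above bottom out near 72
    )
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))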

Imagga
created on 2022-06-18

3d 31.8
man 27.3
render 23.4
automaton 16.6
toy 15.4
person 14.4
cute 12.9
skate 12.7
cartoon 12.5
character 12.3
human 11.2
men 11.2
figure 11
robot 10.8
male 10.6
little 10.6
people 10.6
body 10.4
sport 10.3
future 10.2
sculpture 9.9
art 9.9
futuristic 9.9
fink 9.8
pawn 9.2
doll 9
cyborg 8.9
souvenir 8.8
fiction 8.8
holiday 8.6
saltshaker 8.6
two 8.5
modern 8.4
fun 8.2
hat 8.2
technology 8.2
happy 8.1
face 7.8
winter 7.7
statue 7.6
style 7.4
chessman 7.4
object 7.3
reflection 7.3
child 7.1
game 7.1
boy 7.1
conceptual 7.1
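
Imagga's scores come from its v2 tagging endpoint. A sketch using the requests library, assuming a placeholder API key/secret pair and a hosted image URL:

    import requests

    # Placeholder credentials; Imagga's /v2/tags endpoint uses HTTP Basic auth.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/frog.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))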

Google
created on 2022-06-18
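
No Google tags are listed for this image. For reference, a minimal label-detection sketch with the google-cloud-vision client library; the project setup and image URL are assumptions:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/frog.jpg"))

    # Scores are reported in 0..1; scaled to percentages to match the other services.
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))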

Microsoft
created on 2022-06-18

sketch 93.1
cartoon 91.3
drawing 88.7
black and white 73.6
baby 66
text 60.4
art 57.3
human face 51.9
ceramic ware 26.5
porcelain 9.5
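
Microsoft's tags map onto the Azure Computer Vision Analyze Image call. A sketch against the v3.2 REST endpoint; the resource endpoint and subscription key are placeholders:

    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},  # placeholder
        json={"url": "https://example.org/frog.jpg"},
    )
    # The service reports confidences in 0..1; scale to match the page.
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))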

Face analysis

AWS Rekognition

Age 7-17
Gender Female, 100%
Calm 21.8%
Angry 19.1%
Happy 14.6%
Surprised 13.2%
Confused 11.4%
Fear 10.1%
Disgusted 8.6%
Sad 5.1%

AWS Rekognition

Age 56-64
Gender Male, 99.3%
Happy 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Calm 0%
Disgusted 0%
Confused 0%
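
Each AWS Rekognition block above corresponds to one entry in the FaceDetails list returned by DetectFaces with all attributes requested. A self-contained sketch (the image file is a placeholder):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("frog.jpg", "rb") as f:  # placeholder image
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")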

Microsoft Cognitive Services

Age 1
Gender Female
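
The Microsoft age/gender estimate resembles output from the legacy Face API detect call (access to these attributes has since been restricted by Microsoft). A sketch with placeholder endpoint and key:

    import requests

    resp = requests.post(
        "https://<your-resource>.cognitiveservices.azure.com/face/v1.0/detect",  # placeholder
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},  # placeholder
        json={"url": "https://example.org/frog.jpg"},
    )
    for face in resp.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].capitalize()}")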

Feature analysis

Amazon

Person 93.5%
Toy 87.8%

Captions

Microsoft

a group of people posing for a photo 52.5%
a group of people riding on the back of a horse 41.2%
a group of people pose for a photo 41.1%
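
Multiple caption candidates like these match the Azure Computer Vision Describe Image call with maxCandidates raised above its default of 1. A sketch with placeholder endpoint and key:

    import requests

    resp = requests.post(
        "https://<your-resource>.cognitiveservices.azure.com/vision/v3.2/describe",  # placeholder
        params={"maxCandidates": 3},
        headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},  # placeholder
        json={"url": "https://example.org/frog.jpg"},
    )
    for caption in resp.json()["description"]["captions"]:
        print(caption["text"], round(caption["confidence"] * 100, 1))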

Text analysis

Amazon

RODAK
YT33A°2
3J13 YT33A°2
3J13
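
These strings are Rekognition's raw OCR output (likely mirrored film edge lettering misread, hence the gibberish). The call behind them is DetectText; a minimal sketch printing line-level detections, with a placeholder image file:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("frog.jpg", "rb") as f:  # placeholder image
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    # Rekognition returns both LINE and WORD detections; keep lines only.
    for det in response["TextDetections"]:
        if det["Type"] == "LINE":
            print(det["DetectedText"])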