Human Generated Data

Title

KNEELING WOMAN

Date

-

People

Artist: Hidechika (Chounsai), Japanese

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Helene K. Suermondt, 1973.13

Machine Generated Data

Tags

Amazon
created on 2022-06-10

Figurine 99.4
Human 97.9
Person 94.2
Kneeling 88.5
Sunglasses 79.9
Accessories 79.9
Accessory 79.9
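
For context, the label/confidence pairs above are in the form returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of such a request via boto3 follows; the image filename and confidence threshold are placeholders, not values tied to this record.

```python
# Minimal sketch: fetching object/scene labels with Amazon Rekognition.
# Assumes boto3 is configured with AWS credentials; the filename is a placeholder.
import boto3

def detect_labels(image_path: str, min_confidence: float = 75.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a 0-100 confidence score,
    # matching the "Figurine 99.4" style entries above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("kneeling_woman.jpg"):
        print(f"{name} {confidence:.1f}")
```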

Clarifai
created on 2023-10-29

people 99.3
portrait 99.3
monochrome 99.1
one 98.7
adult 97.1
wear 96.9
woman 96.8
man 95.7
girl 93.7
sitting 91.8
music 91.4
child 91.2
model 89.7
sit 89.7
black and white 89.6
facial expression 89.2
fashion 89.1
nude 86.6
art 86.6
son 85
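
Concept scores like the Clarifai list above can be requested from Clarifai's v2 REST API. The sketch below is hedged: the API key, model id, endpoint, and response handling are assumptions based on Clarifai's public documentation, not details of how this record was produced.

```python
# Hedged sketch: requesting concepts from a Clarifai general image model over
# the v2 REST API. Endpoint, model id, and response shape are assumptions.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public model id

def clarifai_concepts(image_path: str):
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("ascii")
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
        timeout=30,
    )
    response.raise_for_status()
    concepts = response.json()["outputs"][0]["data"]["concepts"]
    # Clarifai returns values in 0-1; scale to percentages like the list above.
    return [(c["name"], c["value"] * 100) for c in concepts]
```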

Imagga
created on 2022-06-10

child 41.8
person 34.6
people 27.9
portrait 27.8
happy 26.9
adult 25.2
cute 22.9
smile 21.4
military uniform 20.6
happiness 20.4
attractive 20.3
boy 19.8
fashion 19.6
face 18.5
model 17.9
juvenile 17.8
lifestyle 17.3
clothing 17
uniform 16.7
joy 16.7
man 15.5
youth 15.3
casual 15.2
family 15.1
little 15
pretty 14.7
black 14.5
childhood 14.3
smiling 13.7
expression 13.6
human 13.5
hair 13.5
male 13.3
fun 12.7
parent 12.7
kid 12.4
modern 11.9
women 11.9
teenager 11.8
love 11.8
fitness 11.7
mother 11.6
father 11.6
studio 11.4
lady 11.4
baseball glove 11.3
cheerful 10.6
sport 10.5
one 10.4
play 10.3
covering 10.2
action 10.2
girls 10
leisure 10
style 9.6
sexy 9.6
look 9.6
body 9.6
standing 9.6
head 9.2
holding 9.1
son 9.1
active 9
consumer goods 9
posing 8.9
dad 8.8
together 8.8
brunette 8.7
eyes 8.6
teen 8.3
outdoors 8.2
healthy 8.2
exercise 8.2
daughter 8.1
athlete 8
brother 7.8
statue 7.8
elegance 7.6
baby 7.5
joyful 7.3
playing 7.3
dress 7.2
aviator 7.2
astronaut 7.2
innocent 7
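
Tag/confidence pairs in the Imagga list above can be produced by Imagga's v2 tagging endpoint. The sketch below is an assumption-laden illustration: the credentials and image URL are placeholders, and the endpoint and response shape follow Imagga's public docs rather than this record's actual pipeline.

```python
# Hedged sketch: tagging an image with the Imagga v2 REST API using HTTP basic
# auth. Credentials and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

def imagga_tags(image_url: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    tags = response.json()["result"]["tags"]
    # Each entry holds a 0-100 confidence and a tag name keyed by language.
    return [(t["tag"]["en"], t["confidence"]) for t in tags]
```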

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

human face 96
person 91.2
text 90.2
clothing 88.7
sketch 84.4
toddler 78.4
baby 74.5
drawing 73.8
black and white 69.5
black 66.4
player 62.7
white 62.5
boy 61.4
posing 57.6
vintage 45.2
old 44.6
image 36.9
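
The Microsoft tags above match the output of the Azure Computer Vision "tag" operation. A hedged sketch of that call follows; the endpoint, key, and filename are placeholders, and the request/response shape is taken from Microsoft's published v3.2 REST documentation, not from this record's pipeline.

```python
# Hedged sketch: image tagging with the Azure Computer Vision REST API
# (v3.2 "tag" operation). Endpoint, key, and filename are placeholders.
import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_AZURE_KEY"                                          # placeholder

def azure_tags(image_path: str):
    with open(image_path, "rb") as f:
        response = requests.post(
            f"{AZURE_ENDPOINT}/vision/v3.2/tag",
            headers={
                "Ocp-Apim-Subscription-Key": AZURE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
            timeout=30,
        )
    response.raise_for_status()
    # Confidence is returned in 0-1; scale to match the percentages listed above.
    return [(t["name"], t["confidence"] * 100) for t in response.json()["tags"]]
```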

Color Analysis

Face analysis

AWS Rekognition

Age 10-18
Gender Female, 99.3%
Surprised 99.7%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Angry 0%
Confused 0%
Calm 0%
Disgusted 0%

AWS Rekognition

Age 6-12
Gender Male, 99.9%
Surprised 99.1%
Fear 6.8%
Calm 4.7%
Confused 2.5%
Sad 2.3%
Angry 1.7%
Happy 1.2%
Disgusted 0.9%
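
The two age/gender/emotion estimates above come from Amazon Rekognition face detection. A minimal sketch of that call follows; boto3 credentials and the image filename are placeholders. Note that Rekognition reports a confidence per emotion, which is why the percentages above do not sum to 100.

```python
# Minimal sketch: face attributes (age range, gender, emotions) with Amazon
# Rekognition. Assumes boto3 is configured; the filename is a placeholder.
import boto3

def detect_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )
    results = []
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        results.append({
            "age": f"{age['Low']}-{age['High']}",
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": [(e["Type"], e["Confidence"]) for e in emotions],
        })
    return results
```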

Microsoft Cognitive Services

Age 41
Gender Female

Microsoft Cognitive Services

Age 1
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
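
The "Very unlikely" / "Unlikely" labels above are the likelihood buckets reported by Google Cloud Vision face detection. A minimal sketch follows, assuming the google-cloud-vision client library and application credentials are set up; the filename is a placeholder.

```python
# Minimal sketch: per-face likelihoods with the Google Cloud Vision API.
from google.cloud import vision

def face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        # Each likelihood resolves to a bucket such as VERY_UNLIKELY or UNLIKELY.
        results.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return results
```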

Feature analysis

Amazon

Person 94.2%
Sunglasses 79.9%

Categories

Imagga

paintings art 99.9%

Text analysis

Google

affe
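
Short fragments like the one above are typical of Google Cloud Vision OCR output. A minimal sketch of that call follows, assuming the google-cloud-vision client library and credentials; the filename is a placeholder.

```python
# Minimal sketch: OCR with the Google Cloud Vision text_detection call.
from google.cloud import vision

def detect_text(image_path: str) -> str:
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    annotations = response.text_annotations
    # The first annotation, if any, holds the full detected text block.
    return annotations[0].description if annotations else ""
```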