Human Generated Data

Title

Untitled (young boy and girl sitting in chair examining Christmas stockings as older boy watches)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9237

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.6
Human 98.6
Person 98
Musician 92.2
Musical Instrument 92.2
Leisure Activities 89.5
Guitar 81.4
Guitarist 79
Performer 79
Furniture 75.8
Clothing 75.4
Apparel 75.4
Chair 73.7
Sitting 73.3
Portrait 64.6
Photography 64.6
Photo 64.6
Face 64.6
Door 60.8
Flooring 58.8

Clarifai
created on 2023-10-26

chair 98.8
people 98.6
man 98.3
sitting 96.8
adult 94.7
monochrome 91.8
sit 90.7
actor 89.3
two 88.7
seat 88.1
child 87.8
medicine 84.3
indoors 83.4
person 82.1
boy 79
health 76.7
three 75.5
wheelchair 74.9
furniture 74.2
hospital 73.7

Imagga
created on 2022-01-23

person 29.3
man 26.3
people 23.4
male 22.7
sport 19.6
adult 18.8
black 16.8
portrait 14.2
ball 14.2
player 13.7
exercise 12.7
athlete 12.3
lifestyle 12.3
music 12
casual 11.9
dark 11.7
body 11.2
men 11.2
training 11.1
clothing 11
musical instrument 10.8
fashion 10.5
device 10.5
style 10.4
guitar 10.2
teen 10.1
teenager 10
silhouette 9.9
helmet 9.9
team 9.9
human 9.7
statue 9.7
motion 9.4
youth 9.4
world 9.3
action 9.3
event 9.2
city 9.1
fitness 9
posing 8.9
cool 8.9
art 8.9
stadium 8.8
urban 8.7
boy 8.7
skill 8.7
dancer 8.6
jeans 8.6
balance 8.5
legs 8.5
performer 8.4
chair 8.4
mask 8.2
danger 8.2
pose 8.2
room 8.1
active 8.1
businessman 7.9
women 7.9
play 7.8
grunge 7.7
old 7.7
performance 7.7
studio 7.6
equipment 7.5
leisure 7.5
figure 7.5
holding 7.4
automaton 7.4
competition 7.3
playing 7.3
business 7.3
looking 7.2
teacher 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

wall 98.8
man 94.8
person 92.2
furniture 73.4
text 63.4
chair 60.5
cartoon 52.8
guitar 19.6
bowed instrument 19.6

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 92.1%
Calm 100%
Sad 0%
Happy 0%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 9-17
Gender Male, 61.8%
Angry 69.5%
Calm 23.6%
Sad 5.2%
Confused 0.7%
Surprised 0.4%
Happy 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Male, 97.6%
Calm 98.1%
Surprised 0.6%
Sad 0.5%
Angry 0.2%
Confused 0.2%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%

Text analysis

Amazon

8
3
st 3 8
st

Google

st 38 a 8
st
38
a
8