Human Generated Data

Title

Untitled (three young girls posed sitting on floor in front of Christmas tree)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9269

Machine Generated Data

Tags (label, confidence score out of 100)

Amazon
created on 2022-01-23

Clothing 96.2
Apparel 96.2
Human 93.9
Person 93.9
Person 93.3
Person 91.4
Play 85
Floor 81.7
Plant 80.4
Tree 80.4
Furniture 75.6
Table 67.9
Lamp 67.1
Child 66
Kid 66
Chair 63.6
Dining Table 60.8
Wood 59.2
Potted Plant 58.1
Vase 58.1
Pottery 58.1
Jar 58.1
Head 58
Indoors 57.6
Living Room 55.4
Room 55.4
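
The Amazon tags above are label names paired with confidence scores on a 0-100 scale. A minimal sketch of how such a list is typically post-processed, filtering to high-confidence labels (the sample pairs are taken from the tag list above; the threshold of 80 is an illustrative choice, not part of the record):

```python
# Rekognition-style (label, confidence) pairs, copied from the tag list above.
labels = [
    ("Clothing", 96.2), ("Human", 93.9), ("Person", 93.9),
    ("Play", 85.0), ("Floor", 81.7), ("Tree", 80.4),
    ("Table", 67.9), ("Child", 66.0), ("Living Room", 55.4),
]

def high_confidence(pairs, threshold=80.0):
    """Keep labels at or above the threshold, sorted by descending confidence."""
    kept = [(name, conf) for name, conf in pairs if conf >= threshold]
    return sorted(kept, key=lambda p: p[1], reverse=True)

print(high_confidence(labels))
# [('Clothing', 96.2), ('Human', 93.9), ('Person', 93.9),
#  ('Play', 85.0), ('Floor', 81.7), ('Tree', 80.4)]
```

The same filtering applies to the Imagga scores below, though those values cluster much lower and would need a lower threshold.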

Imagga
created on 2022-01-23

people 24.5
person 23.7
man 23.5
black 19.2
adult 16.9
teacher 16.3
sport 15.4
male 14.9
style 14.8
youth 14.5
lifestyle 14.4
salon 14.4
equipment 13.3
room 12.9
music 12.7
active 12.1
play 12.1
men 12
blackboard 11.8
fun 11.2
modern 11.2
grunge 11.1
portrait 11
exercise 10.9
team 10.7
hand 10.6
rock 10.4
world 10.3
women 10.3
player 10
musician 9.9
body 9.6
classroom 9.4
happy 9.4
casual 9.3
art 9.3
performer 9.2
backboard 9.2
city 9.1
dance 9.1
fitness 9
one 9
group 8.9
urban 8.7
education 8.7
musical 8.6
motion 8.6
athlete 8.4
studio 8.4
dark 8.3
action 8.3
silhouette 8.3
human 8.2
professional 8.2
teenager 8.2
girls 8.2
lady 8.1
chair 8
business 7.9
drawing 7.9
happiness 7.8
dancer 7.8
board 7.8
color 7.8
concert 7.8
party 7.7
class 7.7
ball 7.6
fashion 7.5
school 7.5
leisure 7.5
outdoors 7.5
event 7.4
playing 7.3
sexy 7.2
hair 7.1
night 7.1
businessman 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

Face analysis

Amazon

AWS Rekognition

Age 1-7
Gender Male, 63.4%
Calm 91.8%
Sad 7.5%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 19-27
Gender Male, 62.2%
Surprised 98.1%
Calm 0.8%
Angry 0.4%
Fear 0.2%
Happy 0.2%
Disgusted 0.1%
Sad 0.1%
Confused 0%
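
Each face record above reports an emotion distribution summing to roughly 100%. A minimal sketch, using the second face's scores from the record above, of reducing such a distribution to its dominant emotion:

```python
# Emotion scores for the second detected face, copied from the record above.
emotions = {
    "Surprised": 98.1, "Calm": 0.8, "Angry": 0.4, "Fear": 0.2,
    "Happy": 0.2, "Disgusted": 0.1, "Sad": 0.1, "Confused": 0.0,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # ('Surprised', 98.1)
```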

Feature analysis

Amazon

Person 93.9%

Captions

Microsoft

a group of men standing next to a window 44.5%
a group of men standing in front of a window 43.2%
a group of people in a room 43.1%

Text analysis

Amazon

a
YT33A2
MJI YT33A2 00200
00200
MJI
13152