Human Generated Data

Title

Untitled (two young girls passing toy bear with baby girl on floor in front of Christmas tree in living room)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9268

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Tree 99.8
Plant 99.8
Person 96.8
Human 96.8
Christmas Tree 96.7
Ornament 96.7
Person 91.1
Person 88.4
Person 70.6
Person 54.3
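
The label/confidence pairs above have the shape of an AWS Rekognition DetectLabels response. A minimal sketch of how such tags can be generated with the AWS SDK for Python (boto3); the file name photo.jpg, the region, and the MinConfidence threshold are assumptions, since the pipeline that produced this record is not documented here.

    import boto3

    # Placeholder file name and region; the actual ingestion pipeline is undocumented.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=50 is an assumed threshold, consistent with the lowest score above (54.3).
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

Repeated labels such as Person appear once per detected instance, each with its own confidence.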

Imagga
created on 2022-01-23

shop 32.5
shoe shop 24.2
people 24
mercantile establishment 23.6
man 23
room 22.8
person 21.5
table 19.4
salon 17.4
interior 16.8
adult 16.2
place of business 15.6
professional 15.2
home 15.1
male 14.9
teacher 14.6
work 13.5
glass 12.4
medical 11.5
indoors 11.4
happiness 11
happy 10.6
bouquet 10.5
decoration 10.4
instrument 10.3
lifestyle 10.1
indoor 10
blackboard 10
worker 10
team 9.8
modern 9.8
business 9.7
chair 9.6
gift 9.5
party 9.4
girls 9.1
style 8.9
barbershop 8.9
classroom 8.7
smiling 8.7
holiday 8.6
men 8.6
smile 8.5
two 8.5
black 8.4
house 8.3
hand 8.3
event 8.3
wedding 8.3
educator 8
celebration 8
day 7.8
couple 7.8
education 7.8
play 7.7
life 7.7
elegant 7.7
setting 7.7
set 7.6
establishment 7.6
clinic 7.6
fashion 7.5
restaurant 7.3
active 7.2
family 7.1
hairdresser 7.1
working 7.1
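
Imagga exposes tagging through a REST endpoint rather than a language SDK. A hedged sketch against the documented /v2/tags endpoint with HTTP basic auth; the credentials and file name are placeholders.

    import requests

    API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholder credentials
    with open("photo.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )
    # Each entry carries a confidence and a per-language tag string.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")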

Google
created on 2022-01-23

(no tags returned)

Microsoft
created on 2022-01-23

text 99.5
christmas tree 95
indoor 93.6
house 60.5
family 18.8
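
These tags look like output from Azure's Computer Vision image-analysis API. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders, not values from this record.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),           # placeholder key
    )
    with open("photo.jpg", "rb") as f:
        analysis = client.analyze_image_in_stream(f, visual_features=[VisualFeatureTypes.tags])
    # Confidences come back in [0, 1]; format as percentages to match the listing above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence:.1%}")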

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 62.3%
Calm 98.1%
Sad 0.7%
Angry 0.5%
Confused 0.3%
Surprised 0.2%
Disgusted 0.1%
Happy 0.1%
Fear 0%
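
The age range, gender estimate, and per-emotion confidences above match the shape of Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, with placeholder file name and region:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are reported as one confidence per category, shown highest first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")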

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
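
The three blocks above are per-face results from Google Cloud Vision face detection, which reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A sketch with the google-cloud-vision client; the file name is a placeholder and credentials are assumed to come from the environment.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enum names map to the "Very unlikely" ... "Very likely" labels above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)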

Feature analysis

Amazon

Person 96.8%

Captions

Microsoft

a group of people in a room 88.2%
a group of people standing in a room 85%
a group of people standing around a table 71.7%
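
Ranked captions like these come from the Azure Computer Vision describe operation, which returns up to max_candidates caption strings with confidences. A sketch under the same placeholder endpoint and key as the tags example:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),           # placeholder key
    )
    with open("photo.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence:.1%}")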

Text analysis

Amazon

a
MJIR
13150
MJIR YE3RAS ACHAA
YE3RAS
ACHAA
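
The strings above (likely film-edge or studio print markings) are raw Rekognition DetectText output, which returns both LINE and WORD detections; that is why the same tokens appear once joined and again separately. A sketch, with placeholder file name and region:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # assumed region
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections give the joined strings; WORD detections repeat the tokens.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])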

Google

YT3RA2
MJI7 YT3RA2 0
0
MJI7
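
Google's equivalent is the Vision API's text_detection, whose first annotation is the full detected block followed by individual tokens, matching the mix of joined and single strings above. A sketch, with placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # The first annotation is the full text; the rest are individual tokens.
    for annotation in response.text_annotations:
        print(annotation.description)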