Human Generated Data

Title

Untitled (two young girls passing toy bear with baby girl on floor in front of Christmas tree in living room)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9268

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Tree 99.8
Plant 99.8
Person 96.8
Human 96.8
Christmas Tree 96.7
Ornament 96.7
Person 91.1
Person 88.4
Person 70.6
Person 54.3
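
The Amazon tags above are label/confidence pairs from the Rekognition label-detection service. A minimal sketch of how such tags can be reproduced with the boto3 detect_labels call follows; the local file name photo.jpg and the MinConfidence threshold are assumptions for illustration, not part of the original record.

    # Sketch: reproduce Rekognition-style label tags for a local image.
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder
    # for a local copy of the photograph.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,  # assumed threshold
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")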

Clarifai
created on 2023-10-27

people 99.6
room 98.9
monochrome 98.9
furniture 98.6
indoors 98.6
group 96.8
family 93.8
many 93.7
several 92.9
child 91.8
adult 91.4
home 91.3
interior design 90.7
administration 90
woman 89.2
wear 88.1
man 86.3
war 85.8
chair 85.3
group together 85

Imagga
created on 2022-01-23

shop 32.5
shoe shop 24.2
people 24
mercantile establishment 23.6
man 23
room 22.8
person 21.5
table 19.4
salon 17.4
interior 16.8
adult 16.2
place of business 15.6
professional 15.2
home 15.1
male 14.9
teacher 14.6
work 13.5
glass 12.4
medical 11.5
indoors 11.4
happiness 11
happy 10.6
bouquet 10.5
decoration 10.4
instrument 10.3
lifestyle 10.1
indoor 10
blackboard 10
worker 10
team 9.8
modern 9.8
business 9.7
chair 9.6
gift 9.5
party 9.4
girls 9.1
style 8.9
barbershop 8.9
classroom 8.7
smiling 8.7
holiday 8.6
men 8.6
smile 8.5
two 8.5
black 8.4
house 8.3
hand 8.3
event 8.3
wedding 8.3
educator 8
celebration 8
day 7.8
couple 7.8
education 7.8
play 7.7
life 7.7
elegant 7.7
setting 7.7
set 7.6
establishment 7.6
clinic 7.6
fashion 7.5
restaurant 7.3
active 7.2
family 7.1
hairdresser 7.1
working 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.5
christmas tree 95
indoor 93.6
house 60.5
family 18.8

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 62.3%
Calm 98.1%
Sad 0.7%
Angry 0.5%
Confused 0.3%
Surprised 0.2%
Disgusted 0.1%
Happy 0.1%
Fear 0%
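
The age range, gender, and emotion scores above are fields returned by Rekognition face detection. A minimal sketch of retrieving them with the boto3 detect_faces call follows; photo.jpg is again a placeholder for a local copy of the image.

    # Sketch: Rekognition face analysis (age range, gender, emotions).
    # Attributes=["ALL"] requests the full attribute set rather than
    # the default summary.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")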

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
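
The three Google Vision blocks above are per-face likelihood ratings, one block per detected face. A minimal sketch of reading the same fields with the google-cloud-vision client follows; photo.jpg is a placeholder path and application credentials are assumed to be configured.

    # Sketch: Google Cloud Vision face detection, printing the same
    # likelihood fields listed above for each detected face.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)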

Feature analysis

Amazon

Person 96.8%

Text analysis

Amazon

a
MJIR
13150
MJIR YE3RAS ACHAA
YE3RAS
ACHAA
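
The strings above are raw OCR detections (likely studio or film-edge markings visible in the print). A minimal sketch of generating such detections with the boto3 detect_text call follows; photo.jpg is a placeholder path.

    # Sketch: Rekognition text detection. The service returns both LINE
    # and WORD detections, which is why short fragments repeat in the
    # list above.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))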

Google

0
MJI7 YT3RA2 0
MJI7
YT3RA2
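
The Google entries above are the equivalent raw OCR output from Google Vision. A minimal sketch of the corresponding google-cloud-vision call follows; the first annotation is the full detected string and the remaining ones are individual tokens, matching the pattern above.

    # Sketch: Google Cloud Vision text detection for the same image.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)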