Human Generated Data

Title

Untitled (Hood's dairy trade fair display)

Date

c. 1939, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5858

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.3
Human 99.3
Person 98.8
Person 98.7
Worker 76.6
Restaurant 75
Cafeteria 69.3
Pub 60
Chef 57.4
Food 55.7
Meal 55.7
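
The label lists in this section are raw outputs of commercial image-tagging services. As a minimal, hypothetical sketch only (assuming the Amazon tags above came from the Rekognition DetectLabels API called via boto3; the file name and thresholds are illustrative, not part of the source record):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local scan of the print; not part of the original record.
with open("durette_hoods_display.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of returned labels
    MinConfidence=55.0,  # drop labels scoring below roughly the weakest tag shown
)

# Print "label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")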

Clarifai
created on 2019-11-16

people 99.9
monochrome 99.6
adult 98.9
man 97.6
group together 97.4
administration 95.9
furniture 95.8
two 95.4
group 95
several 94.5
four 94.3
woman 93.5
one 93.3
indoors 91
three 90.9
vehicle 89.7
military 89.4
sit 88.6
leader 87.2
wear 86.4

Imagga
created on 2019-11-16

counter 52.7
man 26.9
bartender 21.8
male 21.3
people 21.2
person 20.7
desk 20.2
adult 18.6
table 17.2
office 16.8
professional 16.3
furniture 16.3
home 15.9
chair 15.7
shop 15
barbershop 14.9
computer 14.4
interior 14.1
working 14.1
business 14
room 13.6
happy 13.1
men 12.9
house 12.5
work 11.8
indoor 10.9
holding 10.7
job 10.6
education 10.4
student 10
modern 9.8
equipment 9.8
indoors 9.7
executive 9.7
blackboard 9.6
looking 9.6
home appliance 9.6
teacher 9.6
kitchen 9.5
smiling 9.4
smile 9.3
portrait 9
black 9
cheerful 8.9
mercantile establishment 8.9
technology 8.9
businessman 8.8
machine 8.8
sewing machine 8.7
couple 8.7
sitting 8.6
floor 8.4
studio 8.3
device 8.3
appliance 8.2
to 8
classroom 7.9
worker 7.9
furnishing 7.8
pretty 7.7
restaurant 7.6
occupation 7.3
laptop 7.3
copy space 7.2
school 7.2
hair 7.1
women 7.1
medical 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99
person 98.2
clothing 95.3
man 94.6
black and white 84.3
black 82.3
table 76.1
restaurant 69.8
human face 64.5

Face analysis

AWS Rekognition

Age 34-50
Gender Male, 54.8%
Disgusted 45%
Sad 45%
Confused 45%
Happy 54.8%
Fear 45%
Surprised 45.1%
Calm 45%
Angry 45%

AWS Rekognition

Age 23-37
Gender Male, 54.8%
Angry 45%
Happy 54.9%
Disgusted 45%
Calm 45.1%
Fear 45%
Surprised 45%
Sad 45%
Confused 45%

AWS Rekognition

Age 23-37
Gender Female, 54.5%
Happy 48.2%
Angry 46.3%
Disgusted 46.5%
Calm 47%
Fear 45.4%
Surprised 45.3%
Confused 45.4%
Sad 45.9%

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 35
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
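
The per-face rows above report an estimated age range, a gender guess, and a set of emotion scores for each detected face. A minimal sketch, assuming the AWS Rekognition rows came from the DetectFaces API called with Attributes=["ALL"] (file name illustrative, not from the source):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local scan of the print; not part of the original record.
with open("durette_hoods_display.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion scores
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # e.g. HAPPY, SAD, CALM, ...
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")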

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

TO
5
GOOD
HEALTA
HOOD
10
H
HOODS
THE
THE OUENY DOOR TO
Orange
MILK
OUENY
DOOR
CHOOLAE HOODS
SONS
H P HOOD GOOD C HEALTA
CHOOLAE
LOUN ASN
nANCHESTEFRRERN
nANCHESTEFRRERN GMIGE LOUN ASN
P
GMIGE
Modin
MILK I
ISTBTCS
ne ISTBTCS
C
ne
I

Google

DOOR
TO
FEDEDAL
SAVS
ISTRNS
HOODS
Orange
ove
10
THE OPEN DOOR TO GOOD HEALTH MADCHESTER FEDEDAL SAVS LOAN ASSN EAS ISTRNS HOODS OHOCCLAI 5¢ Way Orange ove 10 m
THE
OPEN
GOOD
HEALTH
MADCHESTER
LOAN
ASSN
EAS
OHOCCLAI
Way
m
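
The word lists above are raw OCR output; misreads such as "HEALTA" and "OUENY" are preserved exactly as the services returned them. A minimal sketch, assuming the Amazon list came from the Rekognition DetectText API (file name illustrative, not from the source):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local scan of the print; not part of the original record.
with open("durette_hoods_display.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to the longer strings, WORD detections to single tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))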