Human Generated Data

Title

Untitled (woman standing with two elderly people in rocking chairs)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14779

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99
Person 99
Furniture 97.4
Person 95.1
Clothing 87.3
Apparel 87.3
Person 86.2
Living Room 76.9
Room 76.9
Indoors 76.9
Face 75.2
People 71.4
Female 70.9
Kid 65.6
Child 65.6
Girl 65
Portrait 63.1
Photography 63.1
Photo 63.1
Baby 62.4
Guitar 56.9
Leisure Activities 56.9
Musical Instrument 56.9
Woman 56.2
Mosquito Net 55.9
Toy 55.6
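
The Rekognition tags above pair each label with a confidence score, expressed in percent. As a rough illustration only, and not necessarily the pipeline used for this page, a boto3 request like the following returns output of that shape; the file name, label cap, and confidence threshold are placeholder assumptions.

# Minimal sketch: request image labels from Amazon Rekognition via boto3.
# "photograph.jpg" and the thresholds are illustrative placeholders.
import boto3

client = boto3.client("rekognition")  # credentials and region come from the environment

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,        # cap on the number of labels returned
    MinConfidence=55.0,  # drop low-confidence guesses
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')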

Imagga
created on 2022-01-29

barbershop 85.9
shop 75.3
mercantile establishment 53.7
place of business 35.9
home 31.1
people 29.6
interior 28.3
man 24.9
room 23
chair 22.7
person 22.6
indoors 22
adult 21.4
medical 19.4
hairdresser 18.8
barber chair 18.4
establishment 18
male 17.7
happy 16.9
work 16.6
kitchen 16.4
smiling 15.9
house 15.9
office 14.7
indoor 14.6
worker 14.5
seat 14
lifestyle 13.7
working 13.3
cheerful 13
smile 12.8
professional 12.7
hospital 12.5
patient 12.4
medicine 12.3
men 12
inside 12
furniture 11.6
doctor 11.3
health 11.1
women 11.1
portrait 11
modern 10.5
clinic 10.2
salon 10.1
appliance 9.9
pretty 9.8
family 9.8
laboratory 9.6
research 9.5
happiness 9.4
restaurant 9.3
casual 9.3
20s 9.2
holding 9.1
one 9
lady 8.9
table 8.8
standing 8.7
waiter 8.6
glass 8.6
business 8.5
food 8.5
plate 8.5
clothing 8.4
clothes 8.4
fashion 8.3
equipment 8.2
technology 8.2
newspaper 8.1
hand blower 8.1
to 8
job 8
dinner 7.7
profession 7.7
illness 7.6
two 7.6
talking 7.6
enjoying 7.6
senior 7.5
human 7.5
instrument 7.4
mother 7.4
mature 7.4
window 7.3
computer 7.2
team 7.2
employee 7.1
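
Imagga's tagger is exposed as a REST endpoint, so tag lists like the one above can be fetched over HTTP. The sketch below is a hedged illustration of the v2 tags endpoint using the requests library; the API key, API secret, and image URL are placeholders, and the actual ingest used for this collection may differ.

# Minimal sketch: fetch tags from the Imagga v2 tagging endpoint with HTTP basic auth.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photograph.jpg"},  # placeholder image URL
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')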

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

table 94.5
indoor 92.6
black and white 90.1
furniture 77.3
text 64.3
person 60.9
clothing 53.8
messy 42.5
cluttered 10.8
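
The Microsoft tags are the kind of output returned by Azure Computer Vision's tagging operation. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK with a placeholder endpoint, key, and file name; the service reports confidences on a 0-1 scale, scaled to percent here to match the list above.

# Minimal sketch: tag an image with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                 # placeholder key
)

with open("photograph.jpg", "rb") as f:  # placeholder file name
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")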

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 53.9%
Calm 91.5%
Happy 5.9%
Sad 1.2%
Confused 0.6%
Surprised 0.2%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Male, 95.5%
Calm 100%
Sad 0%
Happy 0%
Surprised 0%
Disgusted 0%
Confused 0%
Fear 0%
Angry 0%
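
The age ranges, gender estimates, and emotion percentages above correspond to the face attributes Rekognition returns when all attributes are requested. A minimal sketch, with a placeholder file name, that prints results in the same shape:

# Minimal sketch: face attribute detection with Amazon Rekognition.
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')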

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
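
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the entries above are categorical. A minimal sketch, assuming the google-cloud-vision client library (v2+), credentials taken from the environment, and a placeholder file name:

# Minimal sketch: face detection with the google-cloud-vision client.
from google.cloud import vision

# Likelihood enum values are 0-5 in this order.
LIKELIHOOD = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY"]

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])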

Feature analysis

Amazon

Person 99%
Guitar 56.9%

Captions

Microsoft

a person standing in front of a refrigerator 51.6%
a person standing on top of a refrigerator 39%
a person standing in front of a refrigerator 38.9%
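
Captions like the three above come from the Azure describe operation, which returns a ranked list of candidate sentences with confidence scores. A minimal sketch, assuming the same SDK as in the tagging example, with a placeholder endpoint, key, and file name:

# Minimal sketch: generate image captions with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                 # placeholder key
)

with open("photograph.jpg", "rb") as f:  # placeholder file name
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")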

Text analysis

Google

ASMALL
ASMALL
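
A minimal OCR sketch with the google-cloud-vision client: text_detection returns the full detected string first and then each word as a separate annotation, which may be why the same string appears twice above. The file name is a placeholder and credentials are assumed to come from the environment.

# Minimal sketch: text (OCR) detection with the google-cloud-vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)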