Human Generated Data

Title

Untitled (two seated elderly women)

Date

1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15574.2

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.3
Person 99.3
Person 98.5
Clothing 93.8
Apparel 93.8
Furniture 82.5
Hat 75.6
Hat 69.6
Face 64.9
Photography 64.9
Photo 64.9
Portrait 64.9
People 64.8
Chair 59.1
Sitting 56

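Label lists like the one above pair each tag with a confidence score, so they can be thinned by a threshold before display. A minimal sketch (the data below is copied from the Amazon list; the `filter_labels` helper and the 80.0 cutoff are assumptions, not part of the museum's pipeline):

```python
# Labels as (name, confidence) pairs, taken from the Amazon list above.
labels = [
    ("Human", 99.3), ("Person", 99.3), ("Person", 98.5),
    ("Clothing", 93.8), ("Apparel", 93.8), ("Furniture", 82.5),
    ("Hat", 75.6), ("Hat", 69.6), ("Face", 64.9),
    ("Chair", 59.1), ("Sitting", 56.0),
]

def filter_labels(labels, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in labels if score >= threshold]

print(filter_labels(labels))
# → ['Human', 'Person', 'Person', 'Clothing', 'Apparel', 'Furniture']
```

Lowering the threshold admits weaker tags such as "Hat" or "Sitting", which is why the raw lists here retain everything down to roughly 50%.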
Imagga
created on 2022-02-05

man 39.6
male 36.1
person 33.3
professional 29.6
office 27.7
people 26.8
businessman 25.6
worker 25.1
business 23.7
working 22.1
work 22
occupation 21.1
adult 19.9
job 19.4
medical 19.4
computer 18.5
laptop 17.5
doctor 16.9
uniform 16.5
medicine 15.8
clothing 15.6
manager 14.9
portrait 14.9
corporate 14.6
men 14.6
smiling 13.7
technology 13.3
suit 13.2
nurse 13
looking 12.8
laboratory 12.5
sitting 12
health 11.8
desk 11.7
lab 11.7
team 11.6
hospital 11.5
patient 11.3
success 11.3
successful 11
indoors 10.5
human 10.5
serious 10.5
equipment 10.3
industry 10.2
specialist 10.1
happy 10
engineer 9.8
profession 9.6
home 9.6
education 9.5
businesspeople 9.5
women 9.5
construction 9.4
mature 9.3
smile 9.3
face 9.2
student 9.2
alone 9.1
modern 9.1
military uniform 9.1
care 9
executive 9
instrument 8.9
building 8.9
surgeon 8.8
chemical 8.7
covering 8.7
sax 8.6
research 8.6
development 8.5
tie 8.5
meeting 8.5
teamwork 8.3
safety 8.3
protection 8.2
group 8.1
helmet 7.9
look 7.9
bright 7.9
scientist 7.8
black 7.8
chemistry 7.7
architect 7.7
room 7.7
test 7.7
collar 7.7
window 7.7
boss 7.6
illness 7.6
workplace 7.6
biology 7.6
career 7.6
coat 7.4
confident 7.3
clinic 7.2
science 7.1
builder 7.1
architecture 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.2
person 95.6
clothing 95.4
window 88.6
human face 88.2
black and white 83.5
fashion accessory 78
man 64.3
hat 52.1

Face analysis

Amazon

Google

AWS Rekognition

Age 49-57
Gender Male, 85.3%
Calm 99.9%
Surprised 0.1%
Sad 0%
Disgusted 0%
Angry 0%
Confused 0%
Happy 0%
Fear 0%
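The emotion scores above sum to roughly 100%, so the reported mood is simply the highest-scoring entry. A minimal sketch (the scores are copied from the Rekognition output above; the `dominant_emotion` helper is an assumption, not Rekognition's API):

```python
# Emotion confidences from the AWS Rekognition face analysis above.
emotions = {
    "Calm": 99.9, "Surprised": 0.1, "Sad": 0.0, "Disgusted": 0.0,
    "Angry": 0.0, "Confused": 0.0, "Happy": 0.0, "Fear": 0.0,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))
# → Calm
```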

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Hat 75.6%

Captions

Microsoft

a man sitting in front of a window 73.6%
a man sitting next to a window 69.2%
a man sitting in a window 69.1%