Human Generated Data

Title

Untitled (men at manufacturing equipment)

Date

c.1953

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18190

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Workshop 99.9
Person 99.4
Human 99.4
Person 99.2
Building 94.4
Person 92.6
Factory 91.5
Machine 80.4
Assembly Line 56.8

Imagga
created on 2022-03-04

barbershop 47.1
shop 44.8
chair 40.3
barber chair 30.2
mercantile establishment 29.5
interior 29.2
room 27.8
furniture 24.3
seat 22.7
man 21.5
place of business 20.4
people 19.5
industry 18.8
table 18.5
indoors 18.4
work 18
machine 17.7
equipment 17.6
modern 17.5
inside 17.5
working 15.9
hairdresser 15.9
business 15.8
hospital 15.5
health 13.9
male 13.5
office 13.3
floor 13
person 12.7
industrial 12.7
job 12.4
adult 12.4
medical 12.4
lifestyle 12.3
restaurant 12.1
worker 11.8
medicine 11.4
light 11.4
salon 11.3
professional 11.1
indoor 11
factory 10.6
device 10.6
patient 10.5
computer 10.4
establishment 10.3
men 10.3
clinic 10.2
architecture 10.2
occupation 10.1
decor 9.7
metal 9.7
design 9
desk 9
technology 8.9
home 8.8
glass 8.6
doctor 8.5
center 8.4
building 8.4
house 8.4
kitchen 8.2
machinery 8.1
station 7.7
monitor 7.7
lamp 7.6
horizontal 7.5
bar 7.4
back 7.3
window 7.3
life 7.3
workshop 7.2
furnishing 7.2
counter 7.1
steel 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 93.9
black and white 86.9
drawing 85.3
person 80.6
clothing 78.5
sketch 75.2
man 57.7

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Happy 69.8%
Surprised 8.6%
Confused 6.8%
Sad 5%
Angry 4.8%
Calm 2.7%
Fear 1.6%
Disgusted 0.9%

AWS Rekognition

Age 41-49
Gender Male, 100%
Calm 73.1%
Happy 20.9%
Confused 2.8%
Disgusted 1.5%
Surprised 0.6%
Angry 0.5%
Sad 0.3%
Fear 0.3%

AWS Rekognition

Age 31-41
Gender Male, 96.6%
Sad 99.6%
Calm 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Angry 0%
Fear 0%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people sitting in front of a building 51.6%
a person sitting in front of a building 45.8%

Text analysis

Amazon

KODVK-SVLELA

Google

FAr
FAr YT37A2-MAGON
YT37A2-MAGON