Human Generated Data

Title

Untitled (man in studio)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1953

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.8
Person 99.8
Tin 84
Can 80.7
Face 72.5
Spray Can 59.3
Beard 58.6
Computer 57.7
Pc 57.7
Electronics 57.7
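
These Amazon tags are label names with confidence scores on a 0-100 scale (the same "Person 99.8%" value reappears under Feature analysis below). A minimal sketch of how such labels can be requested from AWS Rekognition with boto3 follows; the filename, region, and thresholds are placeholder assumptions, not part of this record.

import boto3

# Placeholder region and filename; any local JPEG/PNG under 5 MB works with the Bytes variant.
client = boto3.client("rekognition", region_name="us-east-1")
with open("norfleet_untitled_man_in_studio.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50,
    )
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')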

Imagga
created on 2022-01-08

man 34.9
equipment 29.2
male 26.2
person 25.7
people 25.7
working 23.9
work 20.4
television camera 20.2
professional 19.3
smiling 18.1
indoors 17.6
worker 17.3
machinist 17.1
device 16.8
adult 16.5
television equipment 16.2
lifestyle 15.9
men 15.5
electronic equipment 15.2
job 15
computer 14.6
workshop 13.9
engineer 13.8
occupation 13.7
tool 13.6
sitting 12.9
industry 12.8
repair 12.5
hand 12.2
business 12.1
machine 11.8
garage 11.8
mechanic 11.7
automaton 11.3
color 11.1
service 11.1
smile 10.7
engine 10.6
home 10.4
industrial 10
technician 9.8
portrait 9.7
shop 9.7
technology 9.6
happy 9.4
restaurant 9.4
focus 9.3
looking 8.8
microphone 8.8
concentration 8.7
musician 8.7
skill 8.7
table 8.7
happiness 8.6
profession 8.6
face 8.5
attractive 8.4
car 8.3
confident 8.2
music 8.1
kitchen 8
active 8
microscope 7.9
women 7.9
guy 7.8
education 7.8
guitar 7.8
mechanical 7.8
laboratory 7.7
laptop 7.7
teacher 7.7
research 7.6
tools 7.6
holding 7.4
training 7.4
group 7.3
black 7.2
disk jockey 7.2
school 7.2
steel 7.1
day 7.1
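
The Imagga tags follow the shape of Imagga's v2 tagging endpoint, which returns English tag names with 0-100 confidences. A minimal sketch, assuming placeholder API credentials and image URL; actual scores depend on the model version Imagga serves on a given date.

import requests

# Placeholder credentials and image URL for Imagga's hosted API.
auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/norfleet.jpg"},
    auth=auth,
)
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')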

Microsoft
created on 2022-01-08

person 98.7
indoor 96
clothing 94.8
black and white 92.5
man 91.2
musical instrument 88.4
music 85.7
text 84.6
concert 81.3
guitar 78.1
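
The Microsoft tags match the output of Azure Computer Vision's analyze endpoint with the Tags visual feature (the service reports confidences on a 0-1 scale, shown here scaled to percent). A hedged sketch; the resource endpoint, key, and image URL are placeholders.

import requests

# Placeholder Azure Computer Vision resource endpoint and key.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}
params = {"visualFeatures": "Tags"}
body = {"url": "https://example.org/norfleet.jpg"}
resp = requests.post(f"{endpoint}/vision/v3.2/analyze",
                     headers=headers, params=params, json=body)
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')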

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 70.6%
Sad 91.9%
Calm 3.7%
Fear 1.5%
Disgusted 0.8%
Confused 0.8%
Angry 0.6%
Surprised 0.4%
Happy 0.3%
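
The age range, gender estimate, and ranked emotion scores above have the shape of an AWS Rekognition DetectFaces response with all facial attributes requested. A minimal boto3 sketch; the filename and region are placeholders.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("norfleet_untitled_man_in_studio.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted, with uppercase type names such as "SAD".
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')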

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
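
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why these rows carry no numbers. A short sketch with the google-cloud-vision client; the filename is a placeholder and credentials are assumed to be configured in the environment.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
client = vision.ImageAnnotatorClient()
with open("norfleet_untitled_man_in_studio.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)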

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a man standing in a room 86.8%
a man that is standing in a room 84.9%
a man holding a gun 52.2%
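
Ranked caption candidates with confidences like these match Azure Computer Vision's describe endpoint. A minimal sketch, again with placeholder endpoint, key, and image URL.

import requests

# Placeholder Azure Computer Vision resource endpoint and key.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}
body = {"url": "https://example.org/norfleet.jpg"}
resp = requests.post(f"{endpoint}/vision/v3.2/describe",
                     headers=headers, params={"maxCandidates": "3"}, json=body)
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')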