Human Generated Data

Title

Untitled (woman demonstrating iron)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19106

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.1
Human 99.1
Apparel 80.6
Clothing 80.6
Photo 60.8
Photography 60.8
Sleeve 56.2
Furniture 55.2
Chair 55.2
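
These Amazon tags are label-and-confidence pairs of the kind returned by AWS Rekognition's label detection. A minimal sketch of how a comparable list could be generated with boto3 follows; the bucket and object key are hypothetical placeholders, not part of this record.

```python
import boto3

# Hypothetical placeholders -- this record does not include storage details.
BUCKET = "example-artmuseum-images"
KEY = "burian_untitled_iron.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    MaxLabels=25,
    MinConfidence=50.0,
)

# Print each label with its confidence, mirroring the "Person 99.1" style above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```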

Imagga
created on 2022-03-05

laptop 43.5
computer 38.1
person 33.6
adult 33.3
sitting 32.6
people 29.6
business 29.2
office 28.3
working 28.3
work 27.5
home 26.3
smile 25
smiling 24.6
happy 24.4
notebook 23.5
attractive 23.1
businesswoman 22.7
pretty 21
technology 20.8
lifestyle 20.2
indoors 20.2
worker 19.6
portrait 19.4
professional 19.1
women 19
device 18.8
casual 18.6
lady 17.9
job 17.7
desk 16.5
chair 15.8
interior 15
duplicator 15
indoor 14.6
printer 14.6
student 14.5
success 14.5
secretary 14.5
patient 14.4
table 14.2
modern 13.3
cheerful 13
apparatus 12.9
suit 12.6
sofa 12.6
wireless 12.4
looking 12
confident 11.8
happiness 11.8
machine 11.6
room 11.5
one 11.2
manager 11.2
corporate 11.2
executive 11.1
sick person 11
model 10.9
relaxation 10.9
house 10.9
case 10.8
male 10.7
couch 10.6
using 10.6
education 10.4
phone 10.1
relax 10.1
seat 10
sexy 9.6
equipment 9.3
communication 9.2
20s 9.2
leisure 9.1
relaxing 9.1
holding 9.1
gramophone 9
clothing 8.9
photocopier 8.8
man 8.7
hair 8.7
brunette 8.7
businesspeople 8.5
blond 8.5
salon 8.5
mature 8.4
furniture 8.3
monitor 8.3
cute 7.9
black 7.8
scanner 7.7
youth 7.7
workplace 7.6
reading 7.6
fashion 7.5
study 7.5
inside 7.4
successful 7.3
peripheral 7.3
shower cap 7.3
record player 7.2
facsimile 7
look 7
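
The Imagga tags follow the same name-plus-score pattern. Below is a rough sketch of a request against Imagga's v2 tagging endpoint, assuming placeholder API credentials and a publicly reachable image URL.

```python
import requests

# Hypothetical placeholders; real credentials and image URL are not in this record.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/burian_untitled_iron.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag name with a confidence score, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```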

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 99.3
text 98.6
person 88.9
black and white 88.2
piano 74.6
smile 52.7
computer 30.5
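
The Microsoft tags are consistent with output from Azure Computer Vision's tagging operation. A small sketch using the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint, key, and image URL:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical placeholders; no endpoint, key, or URL appears in this record.
ENDPOINT = "https://example-region.api.cognitive.microsoft.com/"
KEY = "your_computer_vision_key"
IMAGE_URL = "https://example.org/burian_untitled_iron.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns confidences in the 0-1 range; scale to match the list above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```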

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Female, 98.3%
Happy 51%
Surprised 42.8%
Calm 4.7%
Fear 0.8%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%
Sad 0.1%
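
The age range, gender, and emotion scores above match the fields returned by AWS Rekognition face detection when all attributes are requested. A minimal sketch, again with placeholder storage details:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical placeholders -- this record does not include storage details.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-artmuseum-images",
                        "Name": "burian_untitled_iron.jpg"}},
    Attributes=["ALL"],  # required for age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are not guaranteed to be sorted; order them like the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```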

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a person posing for the camera 83.2%
a person standing in front of a computer 56.5%
a person posing for a photo 56.4%
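
Captions with confidence scores like these are the kind of output produced by Azure Computer Vision's image description operation. A short sketch, with the same placeholder endpoint, key, and URL as in the tagging example above:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical placeholders; no endpoint, key, or URL appears in this record.
ENDPOINT = "https://example-region.api.cognitive.microsoft.com/"
KEY = "your_computer_vision_key"
IMAGE_URL = "https://example.org/burian_untitled_iron.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Ask for several caption candidates; confidences come back in the 0-1 range.
description = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```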

Text analysis

Amazon

021
YT37A*2-XAOOX
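
Strings such as these are typical of AWS Rekognition text detection picking up film edge markings in the scan. A minimal sketch, with the same placeholder bucket and key:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical placeholders -- this record does not include storage details.
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-artmuseum-images",
                        "Name": "burian_untitled_iron.jpg"}}
)

# LINE detections correspond to whole strings like those above;
# WORD detections break each line into individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```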

Google

YT37A°2-XAGOX
YT37A°2-XAGOX
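
The Google detections could be reproduced with the Cloud Vision API's text detection feature; repeated strings simply reflect multiple annotations for the same region. A sketch using the google-cloud-vision client, with a placeholder image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical placeholder; the record does not include a public image URI.
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/burian_untitled_iron.jpg")
)

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual blocks.
for annotation in response.text_annotations:
    print(annotation.description)
```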