Human Generated Data

Title

Untitled (woman in chair, reading)

Date

c. 1945

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19155

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 100
Cabinet 94.9
Human 94.8
Person 94.8
Couch 92.5
Room 89.3
Living Room 89.3
Indoors 89.3
Interior Design 86.6
Clothing 83.2
Apparel 83.2
Electronics 69.4
Screen 69.4
Display 69.4
Monitor 69.4
Face 68.2
Bed 67.7
Bedroom 67.1
Skin 67.1
LCD Screen 64.5
Portrait 63.2
Photography 63.2
Photo 63.2
Flooring 62.6
Table 60.4
Dresser 60
Drawer 59.7
Shelf 58.1
Chair 55.3
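
The label/confidence pairs above (values are confidence percentages) are consistent with the output of AWS Rekognition's DetectLabels API. A minimal sketch of how such tags could be retrieved, assuming boto3 and a hypothetical local image file; the filename and confidence cutoff are illustrative, not taken from the record:

# Sketch: fetch label/confidence pairs like those listed above.
import boto3

rekognition = boto3.client("rekognition")

with open("burian_untitled_woman_reading.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed cutoff; the record lists tags down to about 55%
)

# Print each label with its confidence, mirroring the "Tags" list format.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')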

Imagga
created on 2022-03-05

person 33.4
chair 33
home 31.9
room 27.3
man 24.8
people 24
adult 22.1
indoors 20.2
house 20
interior 19.4
furniture 18.7
male 17.7
computer 17.1
sitting 16.3
lifestyle 15.9
kitchen 15.2
modern 14.7
indoor 14.6
table 14.4
working 14.1
seat 13.5
business 13.4
barrow 13.3
happy 13.1
office 13
appliance 12.6
laptop 12.5
handcart 12.3
hospital 12.2
floor 12.1
home appliance 12
device 11.6
patient 11.3
senior 11.2
old 11.1
work 11
elderly 10.5
men 10.3
iron lung 10
decor 9.7
medical 9.7
businessman 9.7
looking 9.6
wheeled vehicle 9.5
day 9.4
sewing machine 9.3
machine 9.2
alone 9.1
domestic 9
health 9
technology 8.9
couple 8.7
wall 8.5
design 8.4
relax 8.4
scholar 8.3
teacher 8.2
care 8.2
one 8.2
respirator 8
smiling 8
smile 7.8
musical instrument 7.7
sit 7.6
wood 7.5
equipment 7.5
clinic 7.5
outdoors 7.5
textile machine 7.4
mature 7.4
vehicle 7.4
inside 7.4
black 7.2
bench 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.9
furniture 93.2
black and white 67.1
computer 56.4

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 87.6%
Calm 99.3%
Surprised 0.6%
Sad 0%
Fear 0%
Confused 0%
Disgusted 0%
Angry 0%
Happy 0%
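
The age range, gender estimate, and emotion scores above match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, again assuming boto3 and a hypothetical image file:

# Sketch: request full facial attributes and print them in the record's format.
import boto3

rekognition = boto3.client("rekognition")

with open("burian_untitled_woman_reading.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')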

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94.8%
Chair 55.3%

Captions

Microsoft

a person sitting in a room 48.6%

Text analysis

Amazon

is
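
A single detected word such as "is" is the kind of result AWS Rekognition's DetectText returns when an image contains little legible text. A minimal sketch, assuming boto3 and a hypothetical image file:

# Sketch: detect any legible text in the image, as in the "Text analysis" section.
import boto3

rekognition = boto3.client("rekognition")

with open("burian_untitled_woman_reading.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Report whole detected lines rather than individual word detections.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])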