Human Generated Data

Title

Untitled (woman sitting on chair in bedroom)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19459

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 100
Human 97.6
Person 97.6
Apparel 90.1
Clothing 90.1
Cabinet 82.6
Room 80.3
Indoors 80.3
Table 67.5
Person 63.7
Chair 63.2
Female 61.2
Bedroom 58
Dresser 57.5
Bed 57.3
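The numbers beside each tag are confidence scores on a 0–100 scale, as returned by Amazon Rekognition's label detection. A minimal sketch of filtering such label/score pairs by a confidence threshold (the labels below are copied from the Amazon list above; the 90.0 cutoff is an arbitrary value chosen for illustration):

```python
# Filter machine-generated labels by a minimum confidence score.
# Label names and scores are copied from the Amazon list above;
# the 90.0 threshold is an arbitrary example value, not part of the record.
labels = [
    ("Furniture", 100.0), ("Human", 97.6), ("Person", 97.6),
    ("Apparel", 90.1), ("Clothing", 90.1), ("Cabinet", 82.6),
    ("Room", 80.3), ("Indoors", 80.3), ("Table", 67.5),
]

def confident_labels(pairs, threshold=90.0):
    """Return label names whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))
# ['Furniture', 'Human', 'Person', 'Apparel', 'Clothing']
```

Note that "Person" appears twice in the full list (97.6 and 63.7), reflecting two separate detections at different confidences.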

Imagga
created on 2022-03-05

chair 49.1
man 26.2
person 25.1
sitting 24.9
adult 23.9
rocking chair 23.5
seat 23
people 20.6
furniture 19.2
male 17
teacher 16.2
lifestyle 15.9
laptop 15
black 13.9
room 13.4
sexy 12.8
professional 12.6
table 12.4
indoors 12.3
couple 12.2
business 12.1
attractive 11.9
women 11.9
educator 11.7
computer 11.7
relaxation 11.7
device 11.6
interior 11.5
cheerful 11.4
lady 11.4
men 11.2
smiling 10.8
office 10.6
fashion 10.5
home 10.4
portrait 10.4
model 10.1
working 9.7
one 9.7
happy 9.4
casual 9.3
two 9.3
relax 9.3
elegance 9.2
technology 8.9
hair 8.7
love 8.7
pretty 8.4
fun 8.2
indoor 8.2
relaxing 8.2
work 8
job 8
businessman 7.9
smile 7.8
happiness 7.8
travel 7.7
musical instrument 7.7
sit 7.6
communication 7.6
house 7.5
human 7.5
vintage 7.4
style 7.4
furnishing 7.3
alone 7.3
dress 7.2
body 7.2
face 7.1
modern 7

Google
created on 2022-03-05

Furniture 95
White 92.2
Chair 91.1
Cabinetry 90.6
Picture frame 90.2
Black 89.6
Table 88.3
Drawer 87.2
Black-and-white 86.6
Style 84.1
Art 81.6
Monochrome 81.4
Monochrome photography 78.5
Chest of drawers 77.3
Stool 75
Room 74.4
Vintage clothing 73.9
Dresser 69.5
Sitting 66.7
Lamp 66.3

Microsoft
created on 2022-03-05

indoor 89.7
text 88
black and white 84
furniture 73.2
chair 66.3
person 66.1
clothing 58.1

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 99.6%
Happy 59.4%
Calm 24.6%
Surprised 12.3%
Disgusted 1.2%
Confused 0.8%
Fear 0.8%
Sad 0.5%
Angry 0.3%
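The emotion scores above sum to roughly 100%, and a face is typically summarized by its highest-scoring emotion. A small sketch, with values copied from the AWS Rekognition output above:

```python
# Pick the dominant emotion from a score map (percent values copied
# from the AWS Rekognition face analysis above).
emotions = {
    "Happy": 59.4, "Calm": 24.6, "Surprised": 12.3,
    "Disgusted": 1.2, "Confused": 0.8, "Fear": 0.8,
    "Sad": 0.5, "Angry": 0.3,
}

dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])
# Happy 59.4
```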

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%
Chair 63.2%

Captions

Microsoft

a person sitting next to a fireplace 82.3%
a person sitting in a room 82.2%
a person sitting in front of a fireplace 82.1%

Text analysis

Amazon

S3

Google

ES NAGON- YT3RA2-NAMTZA3