Human Generated Data

Title

Untitled (woman sitting on chair)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19430

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 100
Human 97.6
Person 97.6
Chair 95.2
Clothing 89.4
Shorts 89.4
Apparel 89.4
Person 87.8
Sitting 81.9
Flooring 63.8
Face 62
Photography 62
Portrait 62
Photo 62
Floor 61.4
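Each machine-generated tag above pairs a label with a confidence score on a 0–100 scale. A minimal sketch for turning such `label score` lines into structured pairs and filtering by a confidence threshold (the plain one-tag-per-line format is an assumption based on how the tags are listed here):

```python
# Parse "label score" lines like the Amazon Rekognition tags above
# into (label, score) pairs, keeping only high-confidence labels.

def parse_tags(lines, threshold=90.0):
    """Return (label, score) pairs with score >= threshold."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # The score is the last whitespace-separated token on the line.
        label, _, score = line.rpartition(" ")
        try:
            tags.append((label, float(score)))
        except ValueError:
            continue  # skip lines without a trailing numeric score
    return [(label, score) for label, score in tags if score >= threshold]

# Sample values taken from the Amazon tag list above.
amazon_tags = [
    "Furniture 100",
    "Human 97.6",
    "Chair 95.2",
    "Clothing 89.4",
    "Flooring 63.8",
]

print(parse_tags(amazon_tags))
# → [('Furniture', 100.0), ('Human', 97.6), ('Chair', 95.2)]
```

The same parser applies unchanged to the Imagga, Google, and Microsoft tag lists, since all four services report scores in the same trailing-number format.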

Imagga
created on 2022-03-05

people 26.2
adult 24.9
person 24.7
fashion 21.9
man 20.8
street 20.2
dress 19
seat 18.2
pretty 18.2
chair 17.7
attractive 17.5
portrait 17.5
lifestyle 16.6
sexy 16.1
sitting 15.5
summer 15.4
legs 15.1
lady 14.6
outdoor 14.5
happy 14.4
alone 13.7
outdoors 13.6
male 13.6
leg 13.3
model 12.4
building 12.4
wall 12.2
style 11.9
elegance 11.8
city 11.6
crutch 11
day 11
sensuality 10.9
black 10.8
leisure 10.8
posing 10.7
indoors 10.5
urban 10.5
couple 10.5
body 10.4
men 10.3
women 10.3
happiness 10.2
life 10
smile 10
support 9.9
human 9.7
staff 9.5
elegant 9.4
youth 9.4
cute 9.3
casual 9.3
sand 9.2
hand 9.1
sidewalk 9.1
device 8.9
lovely 8.9
sun 8.9
standing 8.7
smiling 8.7
business 8.5
modern 8.4
vacation 8.2
exercise 8.2
active 8.1
water 8
home 8
hair 7.9
work 7.9
boy 7.8
travel 7.7
outside 7.7
old 7.7
one person 7.5
one 7.5
action 7.4
park 7.4
full 7.3
pose 7.2

Google
created on 2022-03-05

Black 89.6
Black-and-white 84.2
Style 83.8
Knee 82
Elbow 81.5
Flash photography 79.8
Flooring 79
Tints and shades 77.3
Monochrome 74.3
Human leg 74.1
Monochrome photography 73.9
Barefoot 68.9
Balance 68.4
Shorts 67.8
Waist 66.9
Sitting 64.6
Room 63.9
Recreation 61.9
T-shirt 61.4
Art 60.8

Microsoft
created on 2022-03-05

black and white 91.4
footwear 90.5
person 87.8
text 85.6
street 70.4
clothing 66.6
monochrome 62.6
man 60.5
arm 15.2

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 92.1%
Happy 98.4%
Calm 0.8%
Sad 0.2%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%
Chair 95.2%

Captions

Microsoft

a man standing in front of a building 80.8%
a man standing next to a building 78.7%
a man standing on top of a building 71.6%

Text analysis

Amazon

ES
to
HAMT2A3
MAOOX -YY33A2 HAMT2A3
-YY33A2
MAOOX

Google

NAGON
NAGON