Human Generated Data

Title

Untitled (family around fireplace)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17595

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.3
Human 99.3
Person 98.8
Furniture 95.3
Chair 95.3
Person 94.3
Sitting 91.8
Clothing 91.3
Apparel 91.3
Female 87.4
Living Room 81.3
Indoors 81.3
Room 81.3
Woman 69.3
Meal 68.6
Food 68.6
Couch 68.4
Girl 67.5
People 66.9
Dish 65
Text 57.5
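The label/confidence pairs above follow the shape of an AWS Rekognition `DetectLabels` response, which returns a list of `{"Name": ..., "Confidence": ...}` entries. As a minimal sketch (the sample response below is abridged from the tags listed above; the `min_confidence` threshold is an assumption, not part of the original data), such a response can be flattened into the lines shown:

```python
def format_labels(response, min_confidence=55.0):
    """Flatten a DetectLabels-style response into 'Name confidence' lines,
    sorted by descending confidence, dropping low-confidence labels."""
    labels = [l for l in response["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {l['Confidence']:.1f}" for l in labels]

# Abridged sample in the DetectLabels response shape, using values from above.
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Chair", "Confidence": 95.3},
        {"Name": "Text", "Confidence": 57.5},
        {"Name": "Cat", "Confidence": 12.0},  # below threshold, filtered out
    ]
}

print("\n".join(format_labels(sample)))
# Person 99.3
# Chair 95.3
# Text 57.5
```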

Imagga
created on 2022-02-26

man 28.2
person 24
people 22.8
indoors 21.9
teacher 20.6
room 20
interior 19.4
male 19.1
adult 16.7
sitting 16.3
home 15.9
shop 15.9
men 15.4
salon 15
work 14.1
classroom 14
smiling 13.7
lifestyle 13
cheerful 13
chair 13
washboard 12.7
house 12.5
device 12.4
musical instrument 12.3
couple 12.2
indoor 11.9
women 11.9
educator 11.4
product 11.2
business 10.9
kitchen 10.8
professional 10.7
table 10.6
modern 10.5
old 10.4
portrait 10.3
happiness 10.2
happy 10
holding 9.9
wicker 9.6
luxury 9.4
wall 9.4
architecture 9.4
two 9.3
office 9.2
newspaper 9.2
class 8.7
education 8.6
worker 8.5
senior 8.4
communication 8.4
pretty 8.4
window 8.4
fashion 8.3
creation 8.3
shopping 8.2
decoration 8
businessman 7.9
mercantile establishment 7.7
attractive 7.7
youth 7.7
bakery 7.6
elegance 7.5
togetherness 7.5
building 7.4
phone 7.4
light 7.3
design 7.3
smile 7.1
family 7.1
furniture 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 92.4
indoor 86.6
black and white 83.8
clothing 83.8
person 82.5
wedding dress 75.7
woman 62.3
sink 56.9

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 72.6%
Calm 86.4%
Happy 12.4%
Sad 0.4%
Disgusted 0.3%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 28-38
Gender Female, 54.3%
Happy 77.7%
Calm 17.6%
Surprised 3.3%
Sad 0.9%
Disgusted 0.3%
Confused 0.2%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 14-22
Gender Male, 88%
Angry 31.4%
Calm 27.9%
Sad 11.4%
Fear 8.6%
Confused 8.2%
Happy 5.8%
Disgusted 4.1%
Surprised 2.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
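Each Rekognition face record above reports a full emotion breakdown in percentages. A small sketch of reducing such a breakdown to its dominant emotion (values taken from the first face listed above):

```python
# Emotion percentages from the first AWS Rekognition face above.
emotions = {
    "Calm": 86.4, "Happy": 12.4, "Sad": 0.4, "Disgusted": 0.3,
    "Confused": 0.2, "Angry": 0.1, "Surprised": 0.1, "Fear": 0.1,
}

# Pick the emotion with the highest reported confidence.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```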

Feature analysis

Amazon

Person 99.3%
Chair 95.3%

Captions

Microsoft

a person sitting on display in front of a window 75.7%
a person sitting in front of a window 66.9%
a person sitting at a table in front of a window 66.8%

Text analysis

Amazon

Kuehn
Kuehn Korn
Korn
KODAK---1TW

Google

YT37A°2--AGO
Korn
MJI7--
Korn MJI7-- YT37A°2--AGO