Human Generated Data

Title

Untitled (family in living room and by front steps)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15861

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.2
Human 99.2
Face 91.5
Clothing 83.1
Apparel 83.1
Text 77.1
Portrait 66.6
Photography 66.6
Photo 66.6
Helmet 63.6
Door 62.8
Advertisement 59.3
Drawing 56.7
Art 56.7
Poster 53.5
Person 42.8
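
Tag lists like the Amazon set above are the shape of output produced by AWS Rekognition's DetectLabels operation. A minimal sketch of how comparable labels and confidence scores could be retrieved with boto3; the file path, MaxLabels, and MinConfidence values are illustrative assumptions, not values taken from this record:

import boto3

rekognition = boto3.client("rekognition")

# Assumed local path to a scan of the photograph; not part of this record.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence percentages,
# comparable to the "Person 99.2", "Face 91.5" entries above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=40,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')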

Clarifai
created on 2023-10-27

people 99.6
child 98.4
family 95.8
man 95.4
adult 93.1
home 93
boy 92.8
one 92.7
monochrome 91.6
wear 90.3
two 89.7
illustration 87
woman 85.5
portrait 85.2
lid 84.1
nostalgia 83
leader 82.1
veil 81.1
canine 80.9
recreation 80.4

Imagga
created on 2022-02-05

man 22.8
people 21.7
person 20.7
male 19.3
adult 18.4
lawn mower 17.4
outdoor 14.5
portrait 14.2
garden tool 14
tool 13
cleaner 12.9
women 12.6
happy 12.5
couple 12.2
groom 12
men 12
love 11.8
happiness 11
child 10.9
dress 10.8
black 10.8
park 10.7
businessman 10.6
outdoors 10.5
hand 9.9
business 9.7
window 9.6
television 9.3
summer 9
human 9
sexy 8.8
smiling 8.7
bride 8.6
device 8.4
old 8.4
fashion 8.3
one 8.2
lady 8.1
building 8
day 7.8
smile 7.8
face 7.8
travel 7.7
door 7.7
outside 7.7
pretty 7.7
attractive 7.7
two 7.6
sport 7.6
wife 7.6
field 7.5
wedding 7.4
danger 7.3
dirty 7.2
lifestyle 7.2
holiday 7.2
sunlight 7.1
family 7.1
sky 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 98.9
clothing 85.4
person 79.1
flower 72
footwear 70.6
house 54.2

Color Analysis

Face analysis

AWS Rekognition

Age 12-20
Gender Male, 89.3%
Happy 96.8%
Calm 1.9%
Fear 0.4%
Angry 0.3%
Surprised 0.3%
Disgusted 0.1%
Sad 0.1%
Confused 0.1%

AWS Rekognition

Age 28-38
Gender Female, 68.7%
Calm 94.3%
Happy 1.9%
Sad 1.6%
Surprised 1.5%
Angry 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.1%
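
Age ranges, gender estimates, and emotion percentages like the two blocks above are what AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal sketch, again assuming an illustrative local image file rather than anything from this record:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # assumed path, for illustration only
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each detected face.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')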

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
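
The "Very unlikely" / "Unlikely" ratings above correspond to the likelihood enums returned by Google Cloud Vision face detection. A minimal sketch using the google-cloud-vision client library (recent, proto-plus-based versions); the file path is an illustrative assumption:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # assumed path, for illustration only
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each likelihood field maps to an enum value such as VERY_UNLIKELY or UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)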

Feature analysis

Amazon

Person 99.2%
Person 42.8%
Helmet 63.6%
Poster 53.5%

Categories

Text analysis

Amazon

SAFETY
KODAK
KODAK SAFETY FILM
ETY
FILM
ETY FILM
A
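
Mixed line and word fragments like those above ("KODAK SAFETY FILM", "ETY", "FILM") are typical of AWS Rekognition's DetectText output, which reports both LINE and WORD detections with their own confidences. A minimal sketch under the same illustrative assumptions as the earlier examples:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # assumed path, for illustration only
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a full LINE or a single WORD.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')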

Google

ETY FILM KODAK S'AFETY FILM
ETY
FILM
KODAK
S'AFETY