Human Generated Data

Title

Untitled (family playing by haystack)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15862.4

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.8
Human 99.8
Person 99
Play Area 95.1
Playground 95.1
Person 90.7
Person 86
Person 83.3
Housing 77.5
Building 77.5
Painting 75.6
Art 75.6
Outdoors 68.9
Tent 60.1
House 57.6
Countryside 56.2
Nature 56.2

Clarifai
created on 2023-10-29

people 99.8
adult 98.2
man 97.4
campsite 97
tent 97
monochrome 95.1
war 93.9
woman 92.5
soldier 92.4
one 92
vintage 91.7
group together 90.5
child 90.2
group 89.6
street 89.3
vehicle 87.2
recreation 86.9
wear 86.8
military 86.8
art 86.2

Imagga
created on 2022-02-05

book jacket 24.1
landscape 21.6
old 20.9
structure 20.1
jacket 18.8
sky 17.2
newspaper 16.3
forest 15.7
grunge 15.3
vintage 14.9
wrapping 14.3
art 13.8
product 12.8
black 12.6
billboard 12.6
wall 12.6
trees 12.4
tree 12.4
covering 11.9
frame 11.7
antique 11.2
ancient 11.2
border 10.9
fence 10.9
snow 10.7
outdoor 10.7
fog 10.6
rural 10.6
texture 10.4
pattern 10.3
signboard 10.2
tool 10.2
man 10.1
creation 10
field 10
natural 10
light 10
dark 10
plow 10
park 9.9
environment 9.9
travel 9.9
mountain 9.8
dirt 9.5
screen 9.5
dirty 9
people 8.9
sun 8.9
paper 8.6
grungy 8.5
winter 8.5
film 8.5
building 8.4
outdoors 8.2
negative 8.2
retro 8.2
paint 8.1
morning 8.1
symbol 8.1
holiday 7.9
architecture 7.8
season 7.8
space 7.8
weathered 7.6
wood 7.5
person 7.5
silhouette 7.4
sport 7.4
ecology 7.3
design 7.3
countryside 7.3
road 7.2
activity 7.2
farm 7.1
hay 7.1
summer 7.1
country 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

tree 99.5
text 99.1
black and white 93.5
monochrome 77.9
old 41.2

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 94.4%
Calm 79.8%
Happy 15%
Sad 1.6%
Angry 1.1%
Surprised 1.1%
Disgusted 0.5%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Male, 97.2%
Calm 96.6%
Confused 1.3%
Sad 0.8%
Surprised 0.5%
Fear 0.3%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.8%
Person 99%
Person 90.7%
Person 86%
Person 83.3%
Painting 75.6%
Tent 60.1%

Text analysis

Amazon

SAFETY
KODAK
KODA
KODAK SAFETY FILM
FILM
8

Google

KODAK SAFETY FILM KODA
KODAK
SAFETY
FILM
KODA