Human Generated Data

Title

Untitled (family playing by hay and horses)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15862

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.8
Human 99.8
Person 99
Playground 95.1
Play Area 95.1
Person 90.7
Person 86
Person 83.3
Building 77.5
Housing 77.5
Painting 75.6
Art 75.6
Outdoors 68.9
Tent 60.1
House 57.6
Nature 56.2
Countryside 56.2
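
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be retrieved with boto3; the image path, region, and confidence floor are illustrative assumptions, not the values used for this record.

import boto3

# Assumed region and local image path; the actual pipeline inputs are not documented here.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55,  # the tags above bottom out around 56, so a similar floor is assumed
    )

for label in response["Labels"]:
    # Prints pairs such as "Person 99.8", matching the format of the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")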

Imagga
created on 2022-02-05

book jacket 24.1
landscape 21.6
old 20.9
structure 20.1
jacket 18.8
sky 17.2
newspaper 16.3
forest 15.7
grunge 15.3
vintage 14.9
wrapping 14.3
art 13.8
product 12.8
black 12.6
billboard 12.6
wall 12.6
trees 12.4
tree 12.4
covering 11.9
frame 11.7
antique 11.2
ancient 11.2
border 10.9
fence 10.9
snow 10.7
outdoor 10.7
fog 10.6
rural 10.6
texture 10.4
pattern 10.3
signboard 10.2
tool 10.2
man 10.1
creation 10
field 10
natural 10
light 10
dark 10
plow 10
park 9.9
environment 9.9
travel 9.9
mountain 9.8
dirt 9.5
screen 9.5
dirty 9
people 8.9
sun 8.9
paper 8.6
grungy 8.5
winter 8.5
film 8.5
building 8.4
outdoors 8.2
negative 8.2
retro 8.2
paint 8.1
morning 8.1
symbol 8.1
holiday 7.9
architecture 7.8
season 7.8
space 7.8
weathered 7.6
wood 7.5
person 7.5
silhouette 7.4
sport 7.4
ecology 7.3
design 7.3
countryside 7.3
road 7.2
activity 7.2
farm 7.1
hay 7.1
summer 7.1
country 7
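
The Imagga tags follow the same tag/confidence pattern. A hedged sketch of a request against Imagga's v2 tagging endpoint using the requests library; the credentials and image URL are placeholders, and the response parsing assumes Imagga's documented JSON shape.

import requests

# Placeholder credentials and image URL, for illustration only.
api_key = "YOUR_IMAGGA_API_KEY"
api_secret = "YOUR_IMAGGA_API_SECRET"
image_url = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Prints pairs such as "landscape 21.6", matching the list above.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")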

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

tree 99.5
text 99.1
black and white 93.5
monochrome 77.9
old 41.2

Face analysis

Amazon

Google

AWS Rekognition

Age 20-28
Gender Male, 94.4%
Calm 79.8%
Happy 15%
Sad 1.6%
Angry 1.1%
Surprised 1.1%
Disgusted 0.5%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Male, 97.2%
Calm 96.6%
Confused 1.3%
Sad 0.8%
Surprised 0.5%
Fear 0.3%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%
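
The two AWS Rekognition blocks above give per-face estimates (age range, gender, emotions) of the kind returned by the DetectFaces operation when full attributes are requested. A minimal sketch, assuming a local file and a placeholder region:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types come back uppercase (e.g. CALM); printed here as "Calm 79.8%".
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")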

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
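
The Google Vision entries are likelihood ratings rather than numeric scores. A sketch using the google-cloud-vision client library; the file path is a placeholder and authentication via application default credentials is assumed.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums have names such as VERY_UNLIKELY, matching the ratings above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)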

Feature analysis

Amazon

Person 99.8%
Painting 75.6%
Tent 60.1%

Captions

Microsoft

a vintage photo of a bench 78.2%
a vintage photo of a person 77.1%
a vintage photo of an old building 77%
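
The Microsoft captions are natural-language descriptions with confidence scores, of the sort returned by the Azure Computer Vision describe operation. A hedged sketch; the endpoint, key, and file path are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key, for illustration only.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # Prints lines such as "a vintage photo of a bench 78.2%".
    print(f"{caption.text} {caption.confidence * 100:.1f}%")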

Text analysis

Amazon

SAFETY
KODAK
KODA
KODAK SAFETY FILM
FILM
8
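
The detected strings ("KODAK", "SAFETY FILM", and so on) are the film's edge markings. A minimal sketch of how such text could be read back with Rekognition's DetectText operation; the file path and region are placeholder assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Each detection is a LINE or a WORD; the list above mixes both granularities.
    print(detection["Type"], detection["DetectedText"])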

Google

KODAK
FILM
KODA
KODAK SAFETY FILM KODA
SAFETY