Human Generated Data

Title

Untitled (horse drawn carriage through small beach town)

Date

c. 1960

People

Artist: Mary Lowber Tiers, American 1916 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15859.4

Machine Generated Data

Tags

Amazon
created on 2022-03-25

Person 99.4
Human 99.4
Horse 97.3
Animal 97.3
Mammal 97.3
Nature 94.4
Car 94.1
Transportation 94.1
Vehicle 94.1
Automobile 94.1
Tree 85.7
Plant 85.7
Weather 84.4
Palm Tree 83.7
Arecaceae 83.7
Person 83.1
Building 82.8
Outdoors 81.8
Urban 77.9
City 77.1
Town 77.1
Architecture 70
Neighborhood 69.4
Tower 67.5
Wheel 66
Machine 66
Spire 64.6
Steeple 64.6
LCD Screen 62.1
Electronics 62.1
Screen 62.1
Monitor 62.1
Display 62.1
Downtown 57.8
Housing 57.4
Wheel 57.1
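
The Amazon tags above follow the shape of AWS Rekognition's DetectLabels response: each label name carries a confidence score from 0 to 100. A minimal sketch of how such a list could be produced with boto3, assuming configured AWS credentials and a hypothetical local copy of the photograph:

```python
import boto3

# Hypothetical local file name for the photograph; any JPEG/PNG bytes work.
IMAGE_PATH = "tiers_untitled_beach_town.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# comparable to the percentages listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```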

Clarifai
created on 2023-10-29

monochrome 99.7
people 99.5
street 99.5
man 93.7
beach 91.5
dawn 91.4
city 91.2
vehicle 90.1
adult 88.5
tree 88
black and white 86.3
road 86.1
square 85.7
transportation system 85.7
analogue 85.3
group together 85
group 85
landscape 84.9
no person 84.8
travel 84.2

Imagga
created on 2022-03-25

billboard 65.1
signboard 52.9
structure 48
sky 29.3
snow 28.1
city 25
architecture 21
tree 20.8
travel 20.4
landscape 19.3
winter 17
trees 16.9
building 16.5
weather 16.4
urban 14.9
water 13.4
season 13.3
windowsill 12.2
cold 12.1
old 11.8
clouds 11.8
road 11.7
sunset 11.7
beach 11
transport 11
sun 10.7
night 10.7
office 10.4
scene 10.4
ocean 10.1
park 10
holiday 10
screen 9.9
silhouette 9.9
tourism 9.9
transportation 9.9
sill 9.8
university 9.7
fog 9.7
sliding door 9.6
cloud 9.5
buildings 9.5
grunge 9.4
light 9.4
town 9.3
business 9.1
vintage 9.1
black 9
scenery 9
tower 9
forest 8.7
window 8.7
glass 8.6
construction 8.6
space 8.5
sunrise 8.4
destination 8.4
modern 8.4
landmark 8.1
coast 8.1
river 8
antique 7.8
outdoor 7.6
monitor 7.6
frame 7.5
palm 7.5
outdoors 7.5
reflection 7.4
vacation 7.4
street 7.4
door 7.4
structural member 7.3
lake 7.3
day 7.1
sea 7
scenic 7
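
The Imagga tags follow the same name-plus-confidence pattern. A sketch of how a comparable list could be fetched from Imagga's v2 tagging endpoint, assuming placeholder API credentials and a hypothetical public URL for the image:

```python
import requests

# Placeholders: an Imagga key/secret pair and a publicly reachable image URL.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/tiers_untitled_beach_town.jpg"

# The /v2/tags endpoint authenticates with HTTP Basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each result entry pairs a confidence score with a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```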

Google
created on 2022-03-25

Microsoft
created on 2022-03-25

text 98.3
tree 92.4
sky 81.3
house 74.6
black and white 60.5
image 36.4
sign 19.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Male, 89.1%
Calm 38.7%
Sad 28.8%
Confused 10.5%
Disgusted 7.1%
Angry 6.1%
Fear 3.1%
Surprised 3.1%
Happy 2.5%

AWS Rekognition

Age 48-54
Gender Male, 95.3%
Calm 75.1%
Sad 12.6%
Confused 5%
Happy 2.2%
Disgusted 2.1%
Fear 1%
Surprised 1%
Angry 1%
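
The two face records above match the fields returned by AWS Rekognition's DetectFaces when all attributes are requested: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch under the same assumptions as the labeling example:

```python
import boto3

IMAGE_PATH = "tiers_untitled_beach_town.jpg"  # hypothetical local file

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotions to the
# default bounding-box output.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion types come back uppercased (e.g. "CALM"); sort by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```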

Feature analysis

Amazon

Person 99.4%
Person 83.1%
Horse 97.3%
Car 94.1%
Wheel 66%
Wheel 57.1%

Captions

Microsoft
created on 2022-03-25

a view of a city 88.4%
a view of a city street 82.8%
a sign in front of a window 71.1%
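
The Microsoft captions resemble the output of Azure Computer Vision's Describe Image operation, which returns ranked caption candidates with confidences in [0, 1]. A sketch against the v3.2 REST endpoint, assuming a placeholder Azure resource endpoint and key:

```python
import requests

# Placeholders: endpoint and key from an Azure Computer Vision resource.
ENDPOINT = "https://your-resource.cognitiveservices.azure.com"
KEY = "your_subscription_key"
IMAGE_PATH = "tiers_untitled_beach_town.jpg"  # hypothetical local file

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
    timeout=30,
)
response.raise_for_status()

# Confidence is reported in [0, 1]; the record above shows percentages.
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```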

Text analysis

Amazon

KODYK
G 10
INFORMATION
INFORMATION GIFTS
146
GIFTS
tirn
J.M.
XAGOX
COTELA tirn
الد YT3RAS XAGOX
20GE
الد
65%
YT3RAS
LOR 20GE
COTELA
LOR
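
The Amazon text fragments above are raw OCR strings of the kind AWS Rekognition's DetectText returns; it reports both full lines and individual words with confidences, and garbled or reversed signage comes back verbatim. A minimal sketch under the same assumptions as the earlier Rekognition examples:

```python
import boto3

IMAGE_PATH = "tiers_untitled_beach_town.jpg"  # hypothetical local file

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections with confidences.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')
```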

Google

196 AYT33 A*2XAGON MJI7 Y T37 A2 XAGOX
196
AYT33
A*2XAGON
MJI7
Y
T37
A2
XAGOX