Human Generated Data

Title

Untitled (couple on double bicycle)

Date

c.1950

People

Artist: Mary Lowber Tiers, American, 1916 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15897.2

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2022-02-05

Wheel 98.6
Machine 98.6
Person 98
Human 98
Wheel 97.3
Person 89.2
Transportation 89.1
Vehicle 89
Bike 87.1
Bicycle 85.4
Smoke 83.7
Cyclist 67.5
Sport 67.5
Sports 67.5
Nature 63
Spoke 55.9
Person 44.2
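
The Amazon list above has the shape of output from Amazon Rekognition's DetectLabels API, which returns each label with a confidence score from 0 to 100. A minimal sketch with boto3; the exact parameters used for this record are not documented here, and the S3 bucket and key below are placeholders.

```python
# Sketch: label/confidence pairs like the Amazon tags above, via
# Rekognition DetectLabels. Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MaxLabels=20,
    MinConfidence=40,
)

for label in response["Labels"]:
    # Prints rows such as "Wheel 98.6" or "Bicycle 85.4"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```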

Clarifai
created on 2023-10-29

vehicle 99.5
people 99.2
seated 98.9
transportation system 98.1
group together 95.6
many 94.9
group 94.4
monochrome 94.2
cavalry 94
smoke 94
adult 91.4
bike 91.3
war 88.8
biker 88.2
no person 86.8
military 85.9
man 83.6
several 83.4
aircraft 82.6
tree 82.6

Imagga
created on 2022-02-05

wheeled vehicle 32.2
tricycle 27
sky 21
newspaper 19.7
vehicle 18.9
landscape 16.4
old 16
conveyance 15.9
silhouette 15.7
fence 15.3
product 15.3
grunge 14.5
outdoor 13.8
shopping cart 13.6
structure 13.2
people 12.8
man 12.8
snow 12.4
person 12.2
creation 11.9
building 11.9
sport 11.8
picket fence 11.8
black 11.4
outdoors 10.9
architecture 10.9
handcart 10.9
sunset 10.8
travel 10.6
cloud 10.3
retro 9.8
cold 9.5
male 9.3
tree 9.2
world 9.2
vintage 9.1
summer 9
vacation 9
water 8.7
dusk 8.6
winter 8.5
frame 8.5
power 8.4
color 8.3
beach 8.1
barrier 8.1
transportation 8.1
business 7.9
scene 7.8
antique 7.8
chair 7.8
men 7.7
outside 7.7
sign 7.5
pattern 7.5
house 7.5
evening 7.5
symbol 7.4
spectator 7.4
bicycle 7.3
transport 7.3
dirty 7.2
recreation 7.2
activity 7.2
history 7.2
day 7.1
rural 7
sea 7
wheelchair 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 96.8
fog 94
black 87.1
outdoor 85.5
bicycle 82.4
black and white 74.5
white 74
old 59
vehicle 57.7
sky 54.9
picture frame 7.5

Color Analysis

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 96.4%
Disgusted 53.6%
Happy 30.4%
Calm 8.8%
Sad 4.9%
Confused 0.9%
Angry 0.8%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 11-19
Gender Female, 98.9%
Calm 91.6%
Happy 3.2%
Sad 2%
Fear 1.3%
Angry 1%
Disgusted 0.3%
Confused 0.3%
Surprised 0.3%
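
The two AWS Rekognition blocks above (an age range, a gender estimate, and an emotion distribution per detected face) match the shape of Rekognition's DetectFaces response when all facial attributes are requested. A hedged sketch with boto3; the image location is again a placeholder.

```python
# Sketch: per-face age range, gender, and emotion confidences like the
# AWS Rekognition face blocks above, via DetectFaces with Attributes=['ALL'].
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]       # e.g. {'Low': 19, 'High': 27}
    gender = face["Gender"]      # e.g. {'Value': 'Female', 'Confidence': 96.4}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```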

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
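
The Google Vision rows above are likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, as returned by the Cloud Vision face detection feature. A minimal sketch using the google-cloud-vision client; the local file path is a placeholder, not the location of this image.

```python
# Sketch: per-face likelihood buckets (Surprise, Anger, Sorrow, Joy,
# Headwear, Blurred) like the Google Vision blocks above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("example.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```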

Feature analysis

Amazon

Wheel 98.6%
Wheel 97.3%
Person 98%
Person 89.2%
Person 44.2%
Bicycle 85.4%
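
The repeated per-object percentages here (two Wheel entries, three Person entries) correspond to individual detected instances of a label. In a Rekognition DetectLabels response those appear under each label's Instances list, together with a bounding box. A short, self-contained sketch under the same placeholder assumptions as the earlier label example.

```python
# Sketch: per-instance confidences such as "Wheel 98.6%" / "Wheel 97.3%"
# come from the Instances list of a DetectLabels response.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MinConfidence=40,
)

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # Left/Top/Width/Height as ratios of image size
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% at {box}')
```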

Text analysis

Amazon

KODAK
3
FILM
STOP
KODAK SAFETY
SAFETY
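
The Amazon text results above (the Kodak film-edge markings and the STOP sign) look like output from Rekognition's DetectText API, which returns both full lines and individual words, each with a confidence score. A short sketch with boto3; the image location is a placeholder.

```python
# Sketch: OCR strings like "KODAK SAFETY FILM" via Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
)

for detection in response["TextDetections"]:
    # Type is either LINE or WORD; both kinds appear in the list above.
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```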

Google

3. KODAK S'AFETY FILM
3.
KODAK
S'AFETY
FILM
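
The Google list, including the edge marking read as "S'AFETY", has the shape of Cloud Vision text detection output: the first annotation is the full detected text, followed by the individual words. A sketch under the same placeholder assumptions as the earlier Vision example.

```python
# Sketch: full string plus word-level OCR results, as in the Google
# text list above, via Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("example.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# First annotation is the full text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```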