Human Generated Data

Title

Sea Cliff

Date

1979

People

Artist: Larry White, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.1619

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Wheel 99.9
Machine 99.9
Person 99.5
Human 99.5
Bicycle 99.2
Bike 99.2
Vehicle 99.2
Transportation 99.2
Person 97.4
Person 96.9
Cow 95.5
Animal 95.5
Mammal 95.5
Cattle 95.5
Outdoors 92.6
Nature 88.4
Person 84
Person 82
Shelter 80.6
Countryside 80.6
Building 80.6
Rural 80.6
Person 69.9
Furniture 62.6
People 61.5
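
These label/confidence pairs have the shape of Amazon Rekognition's label-detection output. A minimal sketch of such a call with boto3, assuming configured AWS credentials; the file name is a placeholder for a local copy of the photograph.

import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

# "sea_cliff.jpg" is a hypothetical local copy of the image.
with open("sea_cliff.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # roughly the floor of the scores listed above
    )

# Print label names with confidence scores, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')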

Clarifai
created on 2023-10-25

people 100
group together 99.3
monochrome 99.1
child 98.5
group 98.4
adult 98.3
man 96.7
many 96.3
street 96
furniture 93.8
chair 93.4
woman 92.6
boy 92
campsite 91.2
vehicle 90.3
war 90.3
recreation 89
beach 88.3
seat 87.6
home 86.7
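
Concept scores like these can be requested from Clarifai's prediction API. A rough sketch against the v2 REST endpoint; the API key, image URL, and the general image-recognition model id are assumptions.

import requests

# Placeholders: a valid Clarifai key and a publicly reachable image URL.
API_KEY = "CLARIFAI_API_KEY"
IMAGE_URL = "https://example.org/sea_cliff.jpg"

# Assumption: the general image-recognition model on the v2 predict endpoint.
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
payload = {"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]}

resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concepts are scored 0-1; scale to match the percentages listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')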

Imagga
created on 2022-01-09

seller 38.2
bench 26.9
park bench 24.2
outdoors 21.3
seat 18.4
wheeled vehicle 17.2
outdoor 16.8
people 16.7
old 16
sky 15.9
cemetery 15.2
vehicle 15.1
landscape 14.1
architecture 14.1
barrow 13.7
travel 13.4
park 13.2
wheelchair 12.4
summer 12.2
man 12.1
city 11.6
furniture 11.6
building 11.5
child 11.3
handcart 10.8
vacation 10.6
day 10.2
trees 9.8
tree 9.5
outside 9.4
street 9.2
history 8.9
male 8.9
tricycle 8.7
water 8.7
antique 8.7
clouds 8.4
house 8.4
sea 7.8
color 7.8
ancient 7.8
person 7.6
beach 7.6
church 7.4
conveyance 7.4
mountain 7.1
family 7.1
grass 7.1
love 7.1
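
Tag/confidence pairs like these are what Imagga's tagging endpoint returns. A brief sketch of such a request, assuming hypothetical API credentials and a hosted image URL.

import requests

# Placeholders: Imagga API key/secret pair and a publicly reachable image URL.
AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")
IMAGE_URL = "https://example.org/sea_cliff.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=AUTH,
)
resp.raise_for_status()

# Each entry carries an English tag and a confidence score, as listed above.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')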

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 98.2
text 95.2
child 93.9
person 83.5
bicycle 83.3
black and white 78
clothing 67.9
boy 64.4
toddler 61.6
wheel 58.3
land vehicle 53.3
old 47.5
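
Tags of this kind are what the Azure Computer Vision tagging operation produces. A minimal sketch with the Python SDK; the endpoint, subscription key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: your Azure resource endpoint and subscription key.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("AZURE_SUBSCRIPTION_KEY"),
)

# Tag a publicly reachable copy of the image; confidences come back in 0-1.
result = client.tag_image("https://example.org/sea_cliff.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")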

Color Analysis

Face analysis

AWS Rekognition

Age 7-17
Gender Male, 99.6%
Happy 89.7%
Confused 3.3%
Angry 2.3%
Sad 1.4%
Disgusted 1.3%
Fear 1.1%
Surprised 0.6%
Calm 0.3%

AWS Rekognition

Age 2-8
Gender Male, 61.9%
Happy 50.5%
Angry 22.9%
Sad 9.3%
Confused 7.3%
Calm 4%
Surprised 2.8%
Disgusted 1.9%
Fear 1.3%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 32.7%
Happy 24.1%
Sad 20.9%
Confused 7.5%
Surprised 5.8%
Disgusted 3.4%
Angry 3%
Fear 2.7%

AWS Rekognition

Age 26-36
Gender Male, 95.1%
Happy 87.5%
Fear 7.9%
Calm 1.7%
Angry 1%
Surprised 0.8%
Sad 0.4%
Disgusted 0.4%
Confused 0.2%
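
The four face estimates above (age range, gender, emotions) match the shape of Amazon Rekognition's face-detection output. A minimal sketch with boto3, assuming configured AWS credentials; the file name is a placeholder for a local copy of the photograph.

import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

# "sea_cliff.jpg" is a hypothetical local copy of the image.
with open("sea_cliff.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

# One FaceDetails entry per detected face, as in the four blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')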

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
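
The ratings above are the categorical likelihoods Google Cloud Vision reports for face attributes. A short sketch with the google-cloud-vision client, assuming application default credentials; the image URI is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

# Placeholder URI for a publicly reachable copy of the image.
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/sea_cliff.jpg")
)

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihoods are categorical (VERY_UNLIKELY ... VERY_LIKELY), as listed above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)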

Feature analysis

Amazon

Wheel 99.9%
Person 99.5%
Bicycle 99.2%
Cow 95.5%

Categories