Human Generated Data

Title

Untitled (Laguna Beach)

Date

1979

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5185

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Human 99.8
Person 99.8
Person 99.8
Person 99.8
Person 99.1
Person 99
Person 98.5
Apparel 98.3
Clothing 98.3
Person 98.2
Transportation 95.6
Vehicle 95.6
Automobile 95.6
Car 95.6
Back 89.1
Car 82
Crowd 74.2
Bench 73.2
Furniture 73.2
Person 70.4
Swimwear 69.1
Person 63.5
Meal 63.4
Food 63.4
People 63.2
Shorts 62.2
Car 61.9
Leisure Activities 60.9
Bikini 59.5
Skin 56.4
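
The label names and scores above are the kind of output returned by the Amazon Rekognition DetectLabels API, where each label carries a confidence value between 0 and 100. As a rough illustration only (the exact parameters used to generate this record are not documented here), a minimal boto3 sketch that produces a comparable list might look like this, with "laguna_beach.jpg" as a placeholder filename:

# Minimal sketch of pulling image labels with Amazon Rekognition via boto3.
# Assumes AWS credentials are already configured in the environment.
import boto3

rekognition = boto3.client("rekognition")

with open("laguna_beach.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=50,          # cap on returned labels (assumed value)
        MinConfidence=50.0,    # drop labels below 50% confidence (assumed value)
    )

# Each entry pairs a label name with a confidence score, as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")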

Clarifai
created on 2019-11-15

people 99.9
group together 99.6
many 98.9
adult 96.4
group 96.3
woman 95.1
man 94.5
monochrome 92.2
street 91.4
several 89.9
athlete 89.3
recreation 88.9
vehicle 88.1
crowd 84.9
spectator 84.2
wear 84.1
child 83.2
transportation system 78.2
military 75
competition 73.3

Imagga
created on 2019-11-15

people 25.1
man 24.9
person 22
world 21.7
sport 19.5
outdoors 17.7
portrait 15.5
pedestrian 15.1
adult 15
black 14.4
male 14.4
walking 14.2
silhouette 14.1
sky 14
outdoor 13.8
city 13.3
vacation 13.1
lifestyle 13
beach 12.8
active 12.6
couple 12.2
athlete 12
sunset 11.7
together 11.4
urban 11.4
boy 11.3
women 11.1
summer 10.9
dress 10.8
fun 10.5
two 10.2
travel 9.9
sun 9.7
walk 9.5
men 9.4
military uniform 8.7
run 8.7
water 8.7
uniform 8.6
life 8.6
model 8.6
dark 8.3
clothing 8.3
leisure 8.3
street 8.3
park 8.2
girls 8.2
happy 8.1
activity 8.1
family 8
love 7.9
happiness 7.8
pretty 7.7
old 7.7
dusk 7.6
runner 7.5
fashion 7.5
child 7.5
human 7.5
shore 7.4
action 7.4
teen 7.3
back 7.3
protection 7.3
exercise 7.3
group 7.3
sexy 7.2
holiday 7.2
kin 7.1

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 99.3
clothing 97.4
outdoor 94.1
woman 91.9
person 87.9
black and white 77.6
man 76.5
footwear 76
girl 66.1
sky 64.9
people 62.2
group 60.1
posing 45.3
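
The Microsoft tags above resemble output from the Azure Computer Vision image-tagging endpoint, which returns tag names with confidence values between 0 and 1 (shown here scaled to percentages). A hedged sketch using the azure-cognitiveservices-vision-computervision client, with placeholder endpoint, key, and filename, might be:

# Sketch of image tagging with Azure Computer Vision (endpoint, key, and file are placeholders).
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("laguna_beach.jpg", "rb") as image_file:
    result = client.tag_image_in_stream(image_file)

# Confidence is reported on a 0-1 scale; scale by 100 to compare with the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")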

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 51.2%
Calm 45%
Surprised 45%
Angry 45.1%
Confused 45%
Happy 45%
Fear 45%
Sad 54.8%
Disgusted 45%

AWS Rekognition

Age 27-43
Gender Female, 53.4%
Calm 54.6%
Fear 45%
Disgusted 45%
Surprised 45%
Happy 45%
Angry 45%
Confused 45%
Sad 45.4%

AWS Rekognition

Age 23-35
Gender Male, 54.7%
Calm 47.6%
Happy 45.2%
Angry 47.4%
Disgusted 45.2%
Fear 45.5%
Sad 48.4%
Confused 45.7%
Surprised 45.1%

AWS Rekognition

Age 34-50
Gender Male, 50.4%
Surprised 49.7%
Happy 49.7%
Confused 49.5%
Sad 49.5%
Angry 49.7%
Fear 49.6%
Disgusted 49.5%
Calm 49.7%

AWS Rekognition

Age 37-55
Gender Male, 50.3%
Sad 50.6%
Surprised 46%
Confused 45.4%
Fear 46.5%
Angry 45.4%
Happy 45%
Calm 46.2%
Disgusted 45%

AWS Rekognition

Age 29-45
Gender Male, 50.6%
Surprised 45%
Sad 45.3%
Confused 45%
Happy 45%
Disgusted 45%
Fear 45%
Angry 45%
Calm 54.6%

AWS Rekognition

Age 38-56
Gender Male, 51.1%
Confused 45.1%
Fear 45.5%
Calm 46%
Happy 45%
Sad 51.8%
Angry 46.4%
Surprised 45.3%
Disgusted 45%
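
The per-face blocks above match the shape of responses from the Amazon Rekognition DetectFaces API when full attributes are requested: each detected face carries an age range, a gender estimate with its own confidence, and a set of emotion scores. A minimal boto3 sketch (filename and parameters are illustrative, not taken from this record):

# Sketch of per-face attribute analysis with Amazon Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition")

with open("laguna_beach.jpg", "rb") as image_file:  # placeholder filename
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as a list of {Type, Confidence} entries, one per emotion.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")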

Feature analysis

Amazon

Person 99.8%
Car 95.6%
Bench 73.2%