Human Generated Data

Title

Oceanside

Date

c. 1978

People

Artist: Eric Baden, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.1778

Copyright

© Eric Baden
Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 99.5
Person 99.4
Person 97.7
Person 97.7
Person 93.9
Person 93
Clothing 92.6
Apparel 92.6
Dog 86.3
Mammal 86.3
Animal 86.3
Canine 86.3
Pet 86.3
Person 85.6
Person 85.4
Person 84.4
People 78.2
Face 75.4
Female 61
Shorts 55.7
Hair 55.5
Overcoat 55.4
Coat 55.4

Clarifai
created on 2023-10-26

people 99.9
group 99.8
child 99.4
family 98.7
group together 97.9
woman 97.8
portrait 96.6
adult 94.8
offspring 94.3
man 94.2
son 93.4
three 93.4
monochrome 91.7
baby 91.5
retro 90.4
many 89.6
four 88.2
leader 87.7
sibling 86.6
music 84.1

Imagga
created on 2022-01-23

newspaper 43.7
product 34.2
creation 27.2
man 22.2
people 20.6
person 18.9
home 18.3
male 16.4
adult 15.1
kin 14.7
couple 13.9
portrait 13.6
happy 13.1
attractive 12.6
family 12.4
vintage 12.4
black 12.1
happiness 11.7
interior 11.5
room 10.9
dress 10.8
smile 10.7
couch 10.6
old 10.4
sexy 10.4
love 10.2
house 10
indoors 9.7
antique 9.5
sitting 9.4
smiling 9.4
face 9.2
art 9.2
indoor 9.1
fashion 9
human 9
posing 8.9
child 8.9
boy 8.7
lifestyle 8.7
mother 8.5
pretty 8.4
dark 8.3
inside 8.3
romantic 8
hair 7.9
together 7.9
model 7.8
world 7.8
daily 7.7
men 7.7
retro 7.4
window 7.3
looking 7.2
holiday 7.2
history 7.1
handsome 7.1
night 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.2
clothing 98.6
person 98.3
human face 96.8
smile 94.4
baby 92
newspaper 91.1
toddler 85.5
woman 81.2
man 75.5
group 63.3
boy 57.6
child 56.3
posing 41.3
picture frame 11.6
crowd 1.9

Color Analysis

Face analysis
AWS Rekognition

Age 25-35
Gender Female, 100%
Angry 86.2%
Confused 12.2%
Sad 0.8%
Calm 0.4%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 66-76
Gender Female, 55.5%
Calm 81.9%
Sad 4.9%
Disgusted 4.3%
Confused 4.1%
Angry 2.7%
Surprised 1%
Happy 0.6%
Fear 0.4%

AWS Rekognition

Age 1-7
Gender Male, 94.3%
Sad 77.2%
Angry 9.4%
Calm 5.1%
Fear 3.2%
Confused 3.2%
Disgusted 1.2%
Surprised 0.5%
Happy 0.2%

AWS Rekognition

Age 33-41
Gender Female, 99.7%
Disgusted 42.6%
Calm 19.3%
Sad 18%
Confused 10.3%
Angry 3.1%
Happy 3%
Surprised 2%
Fear 1.6%

AWS Rekognition

Age 54-64
Gender Female, 100%
Confused 42.1%
Calm 39.4%
Angry 9.7%
Sad 3.7%
Fear 2.5%
Disgusted 1.3%
Surprised 1.1%
Happy 0.2%

AWS Rekognition

Age 6-16
Gender Female, 97%
Calm 97.2%
Happy 1.6%
Sad 0.3%
Confused 0.3%
Disgusted 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Female, 98.2%
Sad 56.9%
Angry 26.9%
Calm 12.8%
Disgusted 1%
Fear 0.8%
Confused 0.8%
Happy 0.4%
Surprised 0.3%

AWS Rekognition

Age 21-29
Gender Female, 85.8%
Sad 58.1%
Calm 17.9%
Angry 10.6%
Surprised 6.5%
Confused 2.7%
Disgusted 1.7%
Fear 1.3%
Happy 1.2%

Microsoft Cognitive Services

Age 63
Gender Female

Microsoft Cognitive Services

Age 3
Gender Male

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Dog 86.3%

Categories

Imagga

people portraits 84.1%
paintings art 15%

Text analysis

Amazon

345