Human Generated Data

Title

Farmingdale, L.I.

Date

1979

People

Artist: Eric Baden, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.68

Copyright

© Eric Baden

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.4
Person 99.2
Person 98.2
Crowd 91.2
Face 85
Clothing 71.7
Apparel 71.7
Pub 71
Bar Counter 69.5
Audience 66.3
Suit 57.6
Coat 57.6
Overcoat 57.6
People 56.6
Performer 55.7

Clarifai
created on 2023-10-25

people 99.8
group 99.4
group together 98.9
man 98.6
woman 96.8
adult 96.3
music 95
portrait 94.3
party 89.9
audience 89.4
many 88.9
musician 86.1
several 85.4
movie 84.2
crowd 84.1
band 80.6
nightclub 80.6
recreation 80.1
singer 78.6
wear 78.3

Imagga
created on 2022-01-08

black 25.6
man 25.5
person 25.1
portrait 23.3
adult 22.1
people 20.6
male 17.8
dark 15.9
fashion 15.8
style 15.6
attractive 15.4
dress 14.5
sitting 13.7
sexy 13.6
mask 12.8
face 12.8
model 12.4
eyes 12
one 11.9
hair 11.9
studio 11.4
human 11.2
pretty 11.2
horror 10.7
happy 10.6
head 10.1
world 9.9
evil 9.7
brunette 9.6
art 9.5
smile 9.3
elegance 9.2
night 8.9
witch 8.8
scary 8.7
fear 8.7
men 8.6
expression 8.5
clothing 8.5
sensual 8.2
costume 8
look 7.9
couple 7.8
spooky 7.8
covering 7.8
death 7.7
elegant 7.7
mystery 7.7
winter 7.7
lady 7.3
sensuality 7.3
smiling 7.2
lifestyle 7.2
suit 7.2
looking 7.2
love 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

human face 97.5
clothing 96.9
text 92.9
person 92.9
smile 91
man 85.9
woman 84.3
black and white 78.4
glasses 76.1
people 61
picture frame 14

Color Analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 99.8%
Surprised 56%
Calm 14.3%
Confused 12.9%
Angry 6.4%
Fear 5.3%
Disgusted 2.8%
Sad 1.2%
Happy 1%

AWS Rekognition

Age 20-28
Gender Female, 96%
Sad 57.6%
Calm 39.3%
Confused 0.8%
Angry 0.8%
Fear 0.7%
Surprised 0.3%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 18-24
Gender Female, 80.6%
Calm 73.9%
Angry 11.2%
Surprised 5.8%
Happy 2.8%
Confused 2.3%
Fear 1.8%
Sad 1.8%
Disgusted 0.4%

Microsoft Cognitive Services

Age 43
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Text analysis

Google

Love/es