Human Generated Data

Title

Smithhaven Shopping Center

Date

1978

People

Artist: Eric Baden, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Apeiron Workshops, 2.2002.64

Copyright

© Eric Baden

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Person 99.6
Person 99.5
Person 99.3
Dj 93.6
Machine 72.1
Clothing 63.8
Apparel 63.8
Motor 58.1
Engine 55.7
Person 41.6

Clarifai
created on 2023-10-25

people 99.8
child 97.9
group 97.9
adult 97.8
man 97.2
woman 96.7
group together 94.8
portrait 92.2
education 91.7
family 91.1
boy 91.1
three 91
adolescent 90
music 89.4
monochrome 88.3
wireless communication 87.9
four 84.9
retro 84.8
indoors 84
several 82.1

Imagga
created on 2022-01-08

man 39.6
person 33.5
laptop 26.2
computer 25.9
male 25.5
people 25.1
professional 22.7
office 22.6
work 21.3
business 21.2
brass 20.3
adult 19.1
disk jockey 18.4
working 17.7
technology 17.1
smile 16.4
musical instrument 16.2
wind instrument 16
businessman 15.9
men 15.4
broadcaster 14.7
sitting 14.6
happy 14.4
worker 14.2
modern 14
desk 13.8
table 13.4
lifestyle 13
job 12.4
student 12.2
executive 12.2
businesswoman 11.8
communicator 11.1
communication 10.9
house 10.9
device 10.6
attractive 10.5
couple 10.4
portrait 10.3
corporate 10.3
expression 10.2
smiling 10.1
handsome 9.8
looking 9.6
home 9.6
senior 9.4
equipment 9.3
notebook 9.2
room 9.1
interior 8.8
boy 8.7
education 8.7
black 8.5
alone 8.2
cheerful 8.1
suit 8.1
sport 7.9
chair 7.8
play 7.8
youth 7.7
relax 7.6
one 7.5
mature 7.4
music 7.4
indoors 7
together 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

drawing 96.3
person 93.1
text 89.6
clothing 88.9
sketch 86.7
black and white 81.1
cartoon 76.6
man 53.5

Face analysis

AWS Rekognition

Age 11-19
Gender Male, 63.2%
Calm 75.8%
Sad 18.8%
Confused 2.5%
Angry 0.8%
Surprised 0.7%
Disgusted 0.6%
Fear 0.5%
Happy 0.4%

AWS Rekognition

Age 6-12
Gender Male, 99.7%
Calm 95.1%
Sad 4.3%
Angry 0.3%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Surprised 0%
Happy 0%

AWS Rekognition

Age 31-41
Gender Female, 83.2%
Sad 94.6%
Disgusted 1.8%
Fear 1.8%
Calm 1.4%
Confused 0.2%
Angry 0.2%
Happy 0.1%
Surprised 0%

AWS Rekognition

Age 18-24
Gender Female, 82.3%
Calm 50.2%
Sad 37.1%
Disgusted 4%
Happy 3.9%
Surprised 1.7%
Fear 1.3%
Confused 1%
Angry 0.7%

AWS Rekognition

Age 19-27
Gender Female, 96.1%
Calm 49.8%
Sad 46.7%
Fear 1.3%
Confused 0.8%
Angry 0.5%
Disgusted 0.4%
Surprised 0.3%
Happy 0.2%

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 6
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Categories

Imagga

paintings art 76%
people portraits 22.6%

Captions

Microsoft
created on 2022-01-08

text 65.8%

Text analysis

Amazon

HESS

Google

HESS
HESS