Human Generated Data

Title

Prospect Park, Brooklyn

Date

1950s

People

Artist: Leon Levinstein, American, 1910–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of John Erdman and Gary Schneider from the Helen Gee Collection, 2016.396

Machine Generated Data

Tags

Amazon
created on 2023-07-06

Face 99.7
Head 99.7
Plant 99.5
Tree 99.5
Clothing 99.1
Photography 98.5
Person 97.2
Adult 97.2
Male 97.2
Man 97.2
Vegetation 96.9
Portrait 96.7
Body Part 96.4
Finger 96.4
Hand 96.4
Person 95.5
Person 94.8
Beard 94.7
Person 93.9
Adult 93.9
Male 93.9
Man 93.9
People 93.9
Land 82.6
Nature 82.6
Outdoors 82.6
Woodland 82.6
Hat 74.8
Happy 62.3
Cap 57.3
Smile 56.4
Hat 56.3
Sun Hat 56
Hugging 55.4
Laughing 55.3
Bonnet 55.1

Clarifai
created on 2023-10-13

people 99.8
child 96.9
woman 96.2
adult 96
two 95.7
son 95.6
group 95.1
portrait 94.7
man 94.6
monochrome 91.6
group together 90.8
boy 90.5
three 87.7
recreation 87
wear 86
girl 85.5
baby 78.7
veil 78.4
lid 76.4
interaction 75.8

Imagga
created on 2023-07-06

man 32.2
male 31.3
person 29.7
adult 28.7
people 28.4
child 21.3
couple 16.5
home 15.2
indoors 14.9
men 14.6
portrait 14.2
happy 13.8
sitting 13.7
lifestyle 13.7
life 13.4
smile 12.8
love 12.6
bed 12.3
attractive 11.9
old 11.8
happiness 11.8
computer 11.3
mother 11.1
two 11
smiling 10.8
resting 10.5
room 10.4
laptop 10.3
fan 10.2
casual 10.2
face 9.9
passenger 9.9
business 9.7
sad 9.6
boyfriend 9.6
girlfriend 9.6
black 9.6
women 9.5
work 9.4
senior 9.4
parent 9.2
holding 9.1
one 9
businessman 8.8
hairdresser 8.8
sadness 8.8
together 8.8
depression 8.8
father 8.7
wife 8.5
expression 8.5
adults 8.5
painter 8.4
relaxation 8.4
color 8.3
human 8.2
follower 8.2
groom 8.2
clothing 8
looking 8
husband 8
working 8
hair 7.9
model 7.8
bedroom 7.7
pretty 7.7
executive 7.6
females 7.6
fashion 7.5
one person 7.5
house 7.5
leisure 7.5
alone 7.3
indoor 7.3
dress 7.2
family 7.1
job 7.1
interior 7.1
day 7.1

Google
created on 2023-07-06

Smile 90.6
Black 89.7
Hat 86.2
Gesture 85.3
Black-and-white 84.8
Happy 84.2
Style 83.9
People in nature 83.3
Sun hat 81.6
Tree 81.1
Adaptation 79.3
Monochrome photography 72.4
Monochrome 72.4
Beard 66
Stock photography 64.1
Grass 61.7
Child 58.5
Sitting 57.5
Forest 57.3
Vintage clothing 57

Microsoft
created on 2023-07-06

person 99.9
outdoor 98.7
black and white 87.8
text 85.6
snow 84.8
baby 84.3
human face 77
toddler 76.9
clothing 63.5
old 41.1
crowd 2

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 51.9%
Happy 98.6%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Confused 0.2%
Calm 0.1%
Disgusted 0.1%

AWS Rekognition

Age 33-41
Gender Male, 98.9%
Calm 73.5%
Sad 35.6%
Surprised 6.7%
Fear 6.1%
Angry 1%
Confused 0.7%
Disgusted 0.4%
Happy 0.4%

AWS Rekognition

Age 28-38
Gender Female, 100%
Happy 56.3%
Calm 14.9%
Surprised 10.5%
Fear 7.6%
Sad 7.3%
Angry 3.8%
Confused 2.7%
Disgusted 2.6%

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.2%
Adult 97.2%
Male 97.2%
Man 97.2%
Hat 74.8%

Text analysis

Google

C