Human Generated Data

Title

Untitled

Date

1960s-1975

People

Artist: Garry Winogrand, American, 1928-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Fernando Barnuevo, P2001.68

Copyright

© The Estate of Garry Winogrand, courtesy Fraenkel Gallery

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-08

Person 98.7
Human 98.7
Person 96.7
Person 95.1
Person 94.3
Sitting 94.3
Apparel 91.9
Clothing 91.9
Shorts 87.9
Person 86.9
Restaurant 86.8
Chair 86.4
Furniture 86.4
Meal 64.8
Food 64.8
Cafeteria 64.7
Cafe 63.1
Leisure Activities 62.8
Advertisement 61.2
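
Lists like the one above match the output of Amazon Rekognition's detect_labels API: each label carries a name and a confidence score in percent. A minimal sketch of such a call, assuming configured AWS credentials; the region, bucket, and object key are hypothetical placeholders, not the museum's actual setup:

```python
import boto3

# Hypothetical region, bucket, and key; the museum's storage is not known here.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "winogrand-untitled.jpg"}},
    MaxLabels=20,
    MinConfidence=60.0,
)

# Print "Label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```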

Clarifai
created on 2023-10-25

people 99.7
monochrome 97.9
woman 95.6
street 95.1
adult 94.9
group together 93.9
family 93.1
portrait 92.7
child 91.6
group 90.2
window 88.8
man 86.2
room 85.9
girl 84.8
furniture 77
boy 76.7
documentary 76.6
school 75.3
indoors 75
analogue 74.6
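
The Clarifai list has the shape returned by Clarifai's general image-recognition model: concept names with values in [0, 1], shown here scaled to percent. A minimal sketch against Clarifai's REST API, assuming a personal access token; the model path and image URL are placeholders, and the museum pipeline may pin a different model version:

```python
import requests

PAT = "YOUR_CLARIFAI_PAT"  # hypothetical personal access token
# Public path for Clarifai's general model (an assumption).
URL = ("https://api.clarifai.com/v2/users/clarifai/apps/main"
       "/models/general-image-recognition/outputs")

payload = {
    "inputs": [{"data": {"image": {"url": "https://example.org/winogrand.jpg"}}}]
}
resp = requests.post(URL, json=payload, headers={"Authorization": f"Key {PAT}"})
resp.raise_for_status()

# Concepts carry a name and a 0-1 value; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```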

Imagga
created on 2022-01-08

man 25.5
people 22.9
person 20.1
male 19.9
black 18.8
adult 17.5
night 16.9
silhouette 14.9
portrait 14.2
window 13.8
world 13
lifestyle 13
nightlife 12.7
love 12.6
spectator 12.3
couple 12.2
dark 11.7
shop 11.4
barbershop 11.4
one 11.2
sitting 11.2
men 11.2
women 11.1
human 10.5
urban 10.5
entertainment 10.1
leisure 10
modern 9.8
posing 9.8
business 9.7
friends 9.4
glass 9.3
city 9.1
sensuality 9.1
mercantile establishment 9.1
music 9
fun 9
clothing 9
style 8.9
group 8.9
nightclub 8.8
looking 8.8
dance 8.8
light 8.7
party 8.6
relaxation 8.4
holding 8.2
television 8.2
happy 8.1
body 8
businessman 7.9
model 7.8
billboard 7.8
motion 7.7
youth 7.7
casual 7.6
passion 7.5
photographer 7.5
teenager 7.3
art 7.3
building 7.2
disco 7.2
romance 7.1
interior 7.1
office 7.1
happiness 7
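
Imagga's /v2/tags endpoint produces exactly this kind of long tag list, each entry pairing a confidence score with a localized tag name. A minimal sketch, assuming Imagga credentials; the key, secret, and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # hypothetical
API_SECRET = "YOUR_IMAGGA_SECRET"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/winogrand.jpg"},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth
)
resp.raise_for_status()

# Each entry looks like {"confidence": 25.5, "tag": {"en": "man"}}.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```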

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 97.9
clothing 97.2
text 93
black and white 90.3
man 79.8
woman 62.5
white 62.2
human face 59.9
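
The Microsoft tags match the output of Azure Computer Vision's image-analysis endpoint, which scores tags in [0, 1]. A minimal sketch, assuming an Azure Cognitive Services resource; the endpoint, key, API version, and image URL are placeholders:

```python
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"  # hypothetical
KEY = "YOUR_AZURE_KEY"                                    # hypothetical

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/winogrand.jpg"},
)
resp.raise_for_status()

# Tags carry a name and a 0-1 confidence; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```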

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.9%
Calm 60.5%
Angry 17.4%
Sad 11.6%
Confused 3.5%
Happy 2.2%
Disgusted 1.9%
Surprised 1.8%
Fear 1.2%

AWS Rekognition

Age 25-35
Gender Female, 99.9%
Sad 47.7%
Confused 27.3%
Calm 13%
Disgusted 4.9%
Fear 3.1%
Angry 2.1%
Surprised 1%
Happy 0.8%

AWS Rekognition

Age 41-49
Gender Male, 96.8%
Calm 99.5%
Surprised 0.4%
Sad 0%
Confused 0%
Disgusted 0%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Female, 100%
Happy 99.6%
Calm 0.1%
Sad 0.1%
Surprised 0.1%
Fear 0%
Angry 0%
Disgusted 0%
Confused 0%
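
Each AWS Rekognition block above (age range, gender, and a full emotion distribution) corresponds to one FaceDetails entry from detect_faces called with Attributes=["ALL"]. A minimal sketch, assuming configured AWS credentials; the bucket and key are the same hypothetical placeholders as before:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "winogrand-untitled.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort descending to match the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```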

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
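
Unlike Rekognition, Google Vision reports face attributes as categorical likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why these blocks read "Joy Likely" instead of a score. A minimal sketch with the google-cloud-vision client, assuming application-default credentials; the image URL is a placeholder:

```python
from google.cloud import vision

def show(label: str, likelihood) -> None:
    # Enum names print as e.g. VERY_UNLIKELY; reformat to match the page.
    print(label, likelihood.name.replace("_", " ").capitalize())

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/winogrand.jpg"

response = client.face_detection(image=image)

for face in response.face_annotations:
    show("Surprise", face.surprise_likelihood)
    show("Anger", face.anger_likelihood)
    show("Sorrow", face.sorrow_likelihood)
    show("Joy", face.joy_likelihood)
    show("Headwear", face.headwear_likelihood)
    show("Blurred", face.blurred_likelihood)
```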

Feature analysis

Amazon

Person 98.7%
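
This entry is object-level detection rather than whole-image tagging: in Rekognition, localized objects come from the Instances list attached to a label, each with its own confidence and bounding box. A minimal sketch, reusing the same hypothetical bucket and key:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "winogrand-untitled.jpg"}}
)

# Bounding boxes are in relative (0-1) image coordinates.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(left={box['Left']:.2f}, top={box['Top']:.2f})")
```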

Categories

Imagga

interior objects 74.3%
paintings art 22.4%
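
These two categories look like output from Imagga's categorizer endpoint; "personal_photos" below is an assumed categorizer id inferred from the category names, and the credentials and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # hypothetical
API_SECRET = "YOUR_IMAGGA_SECRET"  # hypothetical

resp = requests.get(
    # "personal_photos" is an assumption, not confirmed by this page.
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/winogrand.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a confidence with a localized category name.
for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```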