Human Generated Data

Title

Untitled (South Street pier, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4230

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 98.6
Person 98.6
Person 98.1
Person 98.1
Person 88.3
Poster 79.7
Advertisement 79.7
Astronaut 76.9
Person 76.1
Horse 74.3
Animal 74.3
Mammal 74.3
Person 73
Person 65.7
Plywood 64.2
Wood 64.2
Person 62
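
The Amazon tags above are the output of AWS Rekognition's label-detection endpoint. As a minimal sketch, assuming configured AWS credentials and a placeholder file name (neither is part of this record), the equivalent call with boto3 is:

import boto3

# Placeholder file name; assumes AWS credentials are already configured.
client = boto3.client("rekognition")
with open("shahn_pier.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=60)

# Each label carries a name and a 0-100 confidence, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')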

Clarifai
created on 2023-10-25

people 99
wear 97.1
adult 97
two 96.3
man 93
collage 92.4
art 92
group 91.3
picture frame 88.4
woman 87.3
vintage 87.1
old 86.5
retro 85.3
painting 84
portrait 83.4
veil 82.7
three 82.6
military 79.9
no person 79.1
one 78.6
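
The Clarifai concepts above are the kind returned by Clarifai's general image-recognition model. This sketch uses the public REST endpoint; the access token, image URL, and exact model path are placeholders and may differ by account and API version:

import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)

# Concepts come back with a 0-1 value; scale to match the 0-100 scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')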

Imagga
created on 2022-01-08

stringed instrument 56.8
musical instrument 49.3
device 32.2
dulcimer 29
vessel 28.4
schooner 25.3
old 20.2
sailing vessel 18.7
wood 18.3
construction 18
architecture 16.4
sea 15.6
ship 15.2
wooden 14.9
sky 14.7
craft 14.3
boat 14.3
psaltery 13.9
travel 12.7
equipment 12.4
building 12.1
house 11.7
ocean 11.6
home 11.2
industry 11.1
work 10.3
religion 9.9
ancient 9.5
roof 9
steel 8.8
wall 8.5
culture 8.5
structure 8.5
religious 8.4
coast 8.1
transportation 8.1
carpenter 8
metal 8
builder 7.9
holiday 7.9
marimba 7.8
harbor 7.7
god 7.6
beach 7.6
percussion instrument 7.5
art 7.5
traditional 7.5
tourism 7.4
church 7.4
rope 7.4
water 7.3
yellow 7.3
industrial 7.3
detail 7.2
vehicle 7.1
machine 7.1
summer 7.1
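
Imagga's tags correspond to its /v2/tags endpoint, which authenticates an API key and secret over HTTP basic auth. A minimal sketch, with placeholder credentials and image URL:

import requests

# Placeholder credentials; Imagga issues a key/secret pair per account.
auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=auth,
)

# Each tag has an English label and a 0-100 confidence, as listed above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')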

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 90.7
indoor 88.5
person 88.1
old 52.4
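
The Microsoft tags match what the Azure Computer Vision analyze endpoint returns. A sketch assuming a v3.2 resource; the endpoint host, key, and image URL are placeholders:

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_KEY"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.com/image.jpg"},
)

# Azure scores tags on a 0-1 scale; scale up to match the values above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')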

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 99.8%
Calm 32.2%
Surprised 24.1%
Disgusted 14.7%
Confused 10%
Angry 8.8%
Fear 4%
Sad 3.4%
Happy 2.7%

AWS Rekognition

Age 51-59
Gender Male, 99.3%
Calm 90.3%
Confused 3.8%
Sad 3.2%
Angry 1.1%
Surprised 0.6%
Disgusted 0.4%
Fear 0.4%
Happy 0.3%

AWS Rekognition

Age 23-31
Gender Male, 89.3%
Calm 73.4%
Happy 22%
Surprised 1.8%
Angry 0.8%
Sad 0.7%
Disgusted 0.6%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 45-51
Gender Male, 98.6%
Calm 74.8%
Disgusted 11.6%
Surprised 3.5%
Confused 2.8%
Angry 2.8%
Sad 1.9%
Happy 1.4%
Fear 1.2%

AWS Rekognition

Age 20-28
Gender Female, 54.5%
Calm 99.2%
Sad 0.2%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%
Angry 0%
Surprised 0%
Confused 0%

AWS Rekognition

Age 53-61
Gender Female, 79.1%
Sad 96.1%
Calm 1%
Confused 0.8%
Angry 0.7%
Disgusted 0.6%
Happy 0.3%
Fear 0.3%
Surprised 0.2%
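
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and an independent confidence per emotion. A minimal detect_faces sketch (the file name is a placeholder):

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently; sort to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')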

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
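
The Google Vision blocks report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch with the google-cloud-vision client, using a placeholder file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)

# Likelihood fields are enums such as VERY_UNLIKELY, UNLIKELY, or VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)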

Feature analysis

Amazon

Person 99.3%
Poster 79.7%
Horse 74.3%

Categories

Imagga

pets animals 99.5%

Text analysis

Amazon

DO
DO BONT
BONT

Google

00 0 00 000
00
0
000
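
The text detections above are OCR fragments ("DO", "BONT", strings of zeros) picked up from signage in the photograph. A Rekognition detect_text sketch, again with a placeholder file name:

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections group words; WORD detections are the individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')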