Human Generated Data

Title

Untitled (South Street piers, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4221

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.7
Person 99.4
Person 99.1
Person 94.4
Person 93.4
Nature 84.2
Outdoors 80.7
People 75
Building 72.2
Person 71.2
Person 62.5
Crowd 61.1
Text 60.8
Architecture 60.7
Person 60.3
Person 56.1
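
The label/score pairs above match the shape of AWS Rekognition's DetectLabels response. A minimal sketch of how such a list can be generated with boto3; the image file name is a placeholder:

    import boto3

    # Placeholder file name; any local JPEG of the photograph works here.
    rekognition = boto3.client("rekognition")
    with open("shahn_south_street_piers.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=55,  # the list above bottoms out near 56
        )

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")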

Clarifai
created on 2023-10-25

people 98.1
slide 96.7
movie 96.6
negative 96
collage 95.6
filmstrip 95.3
wear 94.7
adult 94.5
retro 94.5
group 92.2
vintage 91.9
old 90.9
woman 89.8
man 88.5
sepia 86.5
art 86.3
cinematography 84.6
noisy 84.2
sepia pigment 83.8
street 83.2
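
The Clarifai concepts above can be reproduced through Clarifai's v2 REST API; a hedged sketch, assuming the public general-image-recognition model, with a placeholder access token and image URL:

    import requests

    # Placeholder personal access token and image URL.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_PAT"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/shahn.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concept values are 0-1; the list above reports them as percentages.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")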

Imagga
created on 2022-01-08

architecture 35.7
old 30.6
ancient 26.8
building 22.7
vintage 22.3
facade 22.2
grunge 19.6
antique 19.3
paper 18.3
art 18.1
travel 17.6
historic 17.4
texture 17.4
city 16.7
structure 16.5
sculpture 15.7
culture 14.5
landmark 14.4
wall 14.4
temple 14.2
monument 14
aged 13.6
house 13.5
history 13.4
retro 13.1
drawing 12.8
brown 12.5
construction 12
currency 11.7
religion 11.6
bill 11.4
detail 11.3
style 11.1
stone 10.4
historical 10.4
famous 10.2
town 10.2
church 10.2
sketch 10.2
exterior 10.1
tourism 9.9
balcony 9.8
artistic 9.6
statue 9.5
money 9.4
carving 9.3
finance 9.3
palace 9.2
note 9.2
cash 9.1
crumpled 8.7
urban 8.7
text 8.7
brick 8.7
empty 8.6
old fashioned 8.6
frame 8.6
grain 8.3
dirty 8.1
tower 8.1
design 7.9
sepia 7.8
architectural 7.7
worn 7.6
damaged 7.6
capital 7.6
cityscape 7.6
buildings 7.6
material 7.5
pattern 7.5
window 7.5
representation 7.5
dollar 7.4
letter 7.3
business 7.3
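
Imagga serves this kind of tag list from its /v2/tags endpoint, authenticated with an API key/secret pair over HTTP basic auth; a sketch with placeholder credentials and image URL:

    import requests

    # Placeholder key/secret and image URL.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/shahn.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    resp.raise_for_status()

    # Tags arrive sorted by confidence, matching the ordering above.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")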

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 92.6
person 86.3
clothing 83
old 55.5
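
The Microsoft tags match the shape of Azure Computer Vision's image-tagging output; a sketch using the Python SDK, with placeholder endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder Azure resource endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    # tag_image returns confidences in 0-1; the list above shows percentages.
    result = client.tag_image("https://example.org/shahn.jpg")
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")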

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 88.4%
Happy 4.6%
Sad 3.2%
Surprised 1.1%
Fear 1%
Angry 0.6%
Confused 0.5%
Disgusted 0.4%

AWS Rekognition

Age 27-37
Gender Male, 98.9%
Calm 91.6%
Sad 3.7%
Confused 2.6%
Happy 0.8%
Angry 0.6%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 10-18
Gender Female, 93.9%
Calm 95.8%
Sad 2.5%
Confused 0.4%
Angry 0.3%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 94.4%
Sad 19.7%
Calm 17.8%
Confused 16.1%
Happy 13.1%
Disgusted 12.3%
Angry 10.7%
Surprised 8.1%
Fear 2.3%

AWS Rekognition

Age 26-36
Gender Male, 53.4%
Calm 77.2%
Angry 5%
Disgusted 4.7%
Sad 4%
Happy 4%
Surprised 2.7%
Fear 1.5%
Confused 0.9%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Calm 85.7%
Sad 6%
Happy 4.5%
Angry 2.7%
Surprised 0.3%
Fear 0.3%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 40-48
Gender Male, 98.2%
Calm 54.2%
Happy 24.7%
Sad 15.4%
Confused 1.4%
Surprised 1.4%
Disgusted 1.2%
Angry 1%
Fear 0.6%
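
Each face block above follows AWS Rekognition's DetectFaces response: an estimated age range, a gender guess with confidence, and eight emotion scores. A minimal boto3 sketch, again with a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("shahn_south_street_piers.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for age, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unordered; sort to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")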

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

OVNCHBUNVITC

Google

HBUWVITC
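
Strings like these are raw OCR detections, likely from weathered signage on the piers, where garbled output is common. A sketch of the Rekognition DetectText call behind the Amazon entry (placeholder file name); the Google line would come from Cloud Vision's text detection instead:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("shahn_south_street_piers.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections aggregate the WORD detections beneath them.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])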