Human Generated Data

Title

Untitled (Sixth Avenue, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4257

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.4
Person 99
Clothing 97.1
Apparel 97.1
Train 93.3
Vehicle 93.3
Transportation 93.3
Face 69.6
Furniture 66
Portrait 65.8
Photography 65.8
Photo 65.8
Door 65.6
Undershirt 65.3
Hat 64.7
Mirror 64.6
Window 61.8
Cabinet 55
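
The label-and-score pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch with the boto3 SDK follows; the filename, region, and thresholds are placeholders for illustration, not values taken from this record.

    # Hedged sketch: image labels via Amazon Rekognition DetectLabels (boto3).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("shahn_sixth_avenue.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=55,  # placeholder threshold
    )

    # Each label carries a name and a confidence score, as in the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')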

Clarifai
created on 2023-10-25

people 98.9
negative 97.8
movie 97
portrait 96.9
art 95.5
monochrome 95.5
analogue 94.1
man 94
wear 93.9
adult 93.4
vintage 93
collage 93
street 92.5
slide 91.9
window 91.4
margin 91.4
locomotive 89.9
train 86.6
retro 86.5
one 85.5
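
The concept tags above come from Clarifai's general image-recognition model. A hedged sketch of a request to Clarifai's v2 predict REST endpoint is below; the model id, access token, endpoint form, and image URL are assumptions for illustration only.

    # Hedged sketch: concept tags via Clarifai's v2 predict REST endpoint.
    import requests

    CLARIFAI_KEY = "YOUR_API_KEY"                             # placeholder credential
    MODEL_ID = "general-image-recognition"                    # assumed model id
    IMAGE_URL = "https://example.org/shahn_sixth_avenue.jpg"  # placeholder

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concepts return a name and a 0-1 confidence, shown above as percentages.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))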

Imagga
created on 2022-01-08

passenger 33.6
building 21
black 18.7
people 17.3
architecture 16.5
office 15.9
train 15.1
window 15.1
man 14.8
old 14.6
business 14.6
city 14.1
subway train 13.4
travel 13.4
shop 12.4
urban 12.2
person 11.8
device 11
male 10.6
interior 10.6
conveyance 10.6
public transport 10.3
sitting 10.3
light 10
transportation 9.9
elevator 9.5
men 9.4
historic 9.2
barbershop 9.1
modern 9.1
adult 9
music 9
history 8.9
working 8.8
street 8.3
tourism 8.2
alone 8.2
door 8.1
world 8.1
night 8
room 7.9
glass 7.8
mercantile establishment 7.6
lifting device 7.6
house 7.5
dark 7.5
silhouette 7.4
vintage 7.4
piano 7.4
chair 7.4
inside 7.4
transport 7.3
detail 7.2
home 7.2
portrait 7.1
work 7.1
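
The tags above follow the shape of Imagga's image-tagging API. A hedged sketch using the /v2/tags REST endpoint is below; the credentials and image URL are placeholders.

    # Hedged sketch: image tags via Imagga's /v2/tags endpoint (HTTP Basic auth).
    import requests

    IMAGGA_KEY = "YOUR_API_KEY"                               # placeholder
    IMAGGA_SECRET = "YOUR_API_SECRET"                         # placeholder
    IMAGE_URL = "https://example.org/shahn_sixth_avenue.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Each tag has an English label and a 0-100 confidence, as listed above.
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))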

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 92.6
man 90.8
text 88.4
music 82.7
clothing 79.6
subway 10.8
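
The Microsoft tags above match the output of the Azure Computer Vision image-analysis service. A hedged sketch of a REST call requesting tags is below; the resource endpoint, key, and image URL are placeholders.

    # Hedged sketch: image tags via the Azure Computer Vision v3.2 "analyze" endpoint.
    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
    IMAGE_URL = "https://example.org/shahn_sixth_avenue.jpg"        # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    # Tags return 0-1 confidences, shown above as percentages.
    for tag in response.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))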

Face analysis

AWS Rekognition

Age 2-10
Gender Male, 99.9%
Calm 97.5%
Fear 1.3%
Sad 0.4%
Surprised 0.3%
Happy 0.2%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 24-34
Gender Male, 89%
Calm 74.8%
Sad 6.8%
Angry 4.5%
Disgusted 4.2%
Fear 3.5%
Happy 3.3%
Confused 2%
Surprised 0.8%

AWS Rekognition

Age 25-35
Gender Male, 99.4%
Confused 72.3%
Disgusted 8.9%
Calm 5.6%
Happy 4.6%
Fear 3.2%
Sad 2.2%
Angry 1.7%
Surprised 1.4%
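
The age range, gender, and emotion scores above are the fields returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch with boto3 follows; the filename and region are placeholders.

    # Sketch: face attributes via Amazon Rekognition DetectFaces (boto3).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("shahn_sixth_avenue.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions, not just boxes
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')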

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
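
The Google Vision rows above correspond to the likelihood fields of a face-detection response (surprise, anger, sorrow, joy, headwear, blur). A minimal sketch with the google-cloud-vision client follows; the image path is a placeholder.

    # Sketch: face likelihoods via Google Cloud Vision face detection.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shahn_sixth_avenue.jpg", "rb") as f:  # placeholder filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each detected face reports likelihoods such as VERY_UNLIKELY or POSSIBLE.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)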

Feature analysis

Amazon

Person 99.7%
Train 93.3%

Text analysis

Google

VMCHBOW IC O00000OOL T831 3 3 t t
VMCHBOW
IC
O00000OOL
T831
3
t
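
The strings above are OCR fragments of the kind Google Cloud Vision's text detection extracts from signage in a photograph. A minimal sketch follows; the image path is a placeholder.

    # Sketch: OCR fragments via Google Cloud Vision text detection.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("shahn_sixth_avenue.jpg", "rb") as f:  # placeholder filename
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected text; the rest are individual fragments.
    for annotation in response.text_annotations:
        print(annotation.description)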