Human Generated Data

Title

Untitled (Artists' Union demonstrators, New York City)

Date

1934-1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4225

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.6
Person 99.6
Person 99.6
Person 99.5
Person 99.2
Person 98.5
Person 94.7
Shop 90
Car 89.6
Transportation 89.6
Vehicle 89.6
Automobile 89.6
Poster 85.1
Advertisement 85.1
Machine 83.1
Wheel 83.1
Window Display 80.5
Wheel 77.4
Wheel 72.1
Person 64.1
Door 56
Restaurant 55.7
Cafeteria 55.7
Person 44.1
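
The Amazon label list above is the kind of output returned by Rekognition's DetectLabels operation. A minimal sketch, assuming boto3 with AWS credentials configured and a local copy of the image (the filename is a placeholder):

import boto3

rekognition = boto3.client("rekognition")

# Placeholder path; substitute the actual image file.
with open("shahn_artists_union.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores on a 0-100 scale.
# Repeated entries such as "Person" likely reflect per-instance detections
# reported under each label's Instances field.
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=40)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))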

Clarifai
created on 2023-10-25

people 99
group 97.1
wear 95.5
adult 94.6
art 94.3
vintage 94.2
retro 94.2
man 90.3
woman 90.1
sepia pigment 89.4
old 88.5
sepia 87.5
illustration 85.7
antique 85
paper 83.4
movie 79.3
mammal 78.4
collage 77.8
child 77.6
desktop 75.2

Imagga
created on 2022-01-08

old 34.8
wall 26.7
ancient 23.3
city 22.4
vintage 22.3
grunge 22.1
texture 21.5
art 21.1
building 20.1
paper 19.9
architecture 19
antique 17.6
textured 16.7
frame 15.8
shop 15.6
retro 15.6
travel 15.5
aged 15.4
history 15.2
house 15
sculpture 15
tourism 14.8
window 14.8
historical 14.1
monument 14
historic 13.8
stone 13.6
paint 12.7
brown 12.5
dirty 11.7
material 11.6
street 11
structure 11
balcony 10.5
weathered 10.4
artistic 10.4
facade 10.3
blank 10.3
mercantile establishment 10.2
padlock 10
detail 9.7
decoration 9.5
construction 9.4
fastener 9.4
rough 9.1
landmark 9
pattern 8.9
lock 8.8
home 8.8
urban 8.7
empty 8.6
damaged 8.6
drawing 8.6
culture 8.5
wallpaper 8.4
page 8.4
exterior 8.3
device 8.1
sketch 7.8
sepia 7.8
rust 7.7
architectural 7.7
parchment 7.7
worn 7.6
rusty 7.6
statue 7.6
grungy 7.6
famous 7.4
town 7.4
design 7.3
wooden 7
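
Tags like the Imagga list above can be requested over its REST interface. A hedged sketch, assuming Imagga's v2 /tags endpoint with placeholder credentials and a direct file upload (both the endpoint behavior and the response shape are assumptions here):

import requests

API_KEY = "your_imagga_key"        # placeholder credential
API_SECRET = "your_imagga_secret"  # placeholder credential

# Assumption: the v2 tagging endpoint accepts a file upload and returns
# tags with confidence scores under result.tags.
with open("shahn_artists_union.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))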

Google
created on 2022-01-08

Photograph 94.2
Art 82.7
Car 80.1
Suit 80.1
Motor vehicle 79.6
Font 75.6
Tints and shades 74.5
Hat 71.6
Room 68.4
Paper product 67.6
Vintage clothing 66.7
History 66.7
Visual arts 66.1
Kit car 63.7
Stock photography 62.5
Classic 60.2
Collection 59
Working animal 57.8
Wheel 56.4
Uniform 56.1
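
The Google labels resemble output from the Cloud Vision API's label detection. A minimal sketch, assuming the google-cloud-vision client library and application default credentials:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_artists_union.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

# label_detection returns label annotations with a score in [0, 1];
# the percentages above look like score * 100.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))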

Microsoft
created on 2022-01-08

vehicle 96.1
land vehicle 94.9
person 94
car 92.9
clothing 92.7
indoor 89.4
text 87.1
window 83.6
wheel 60.7
man 58.7
old 48.2
stone 4.1
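
The Microsoft tags are consistent with Azure Computer Vision image analysis. A sketch, assuming the azure-cognitiveservices-vision-computervision SDK and placeholder endpoint and key values:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "your_azure_key"                                             # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Request tag features for a local image; tag confidences are in [0, 1].
with open("shahn_artists_union.jpg", "rb") as f:
    analysis = client.analyze_image_in_stream(
        f, visual_features=[VisualFeatureTypes.tags]
    )
for tag in analysis.tags:
    print(tag.name, round(tag.confidence * 100, 1))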

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 99.1%
Calm 97.4%
Sad 1.6%
Happy 0.2%
Angry 0.2%
Disgusted 0.2%
Confused 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Female, 99.8%
Disgusted 98.9%
Confused 0.4%
Happy 0.3%
Sad 0.1%
Calm 0.1%
Angry 0.1%
Fear 0.1%
Surprised 0%

AWS Rekognition

Age 25-35
Gender Female, 98.7%
Calm 99%
Sad 0.2%
Confused 0.2%
Surprised 0.1%
Fear 0.1%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 18-26
Gender Male, 73.3%
Calm 93.1%
Sad 4%
Angry 0.8%
Happy 0.7%
Confused 0.7%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Male, 98.7%
Calm 98.9%
Surprised 0.7%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 71.9%
Calm 61.8%
Sad 30.6%
Happy 3.5%
Disgusted 1.8%
Angry 0.8%
Fear 0.7%
Confused 0.6%
Surprised 0.3%

AWS Rekognition

Age 18-24
Gender Female, 99.2%
Calm 90.3%
Confused 2.6%
Happy 1.9%
Angry 1.4%
Disgusted 1.3%
Surprised 1.2%
Sad 0.7%
Fear 0.5%
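
The per-face age range, gender, and emotion estimates above match the shape of Rekognition's DetectFaces output. A minimal sketch, assuming boto3 is configured as in the label example:

import boto3

rekognition = boto3.client("rekognition")

with open("shahn_artists_union.jpg", "rb") as f:  # placeholder path
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# Each FaceDetail carries an age range, a gender estimate, and a list of
# emotions with confidences, which is how the blocks above are laid out.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']} {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")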

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
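
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages, which matches its face detection annotations. A minimal sketch, again assuming the google-cloud-vision client library:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_artists_union.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

# Face annotations expose joy/sorrow/anger/surprise plus headwear and blur
# as Likelihood enums, i.e. the "Very unlikely" style values above.
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)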

Feature analysis

Amazon

Person 99.6%
Car 89.6%
Poster 85.1%
Wheel 83.1%

Categories

Imagga

interior objects 72.8%
paintings art 26.9%

Text analysis

Amazon

NEED
THE
LICENSE
-
HC
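
Word fragments like those above are the kind of output Rekognition's DetectText returns for picket signs and shop lettering. A minimal sketch, assuming boto3 as before:

import boto3

rekognition = boto3.client("rekognition")

with open("shahn_artists_union.jpg", "rb") as f:  # placeholder path
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# TextDetections include both LINE and WORD entries; printing WORD entries
# yields a fragment list like the one above.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], round(detection["Confidence"], 1))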

Google

000 00 0
000
00
0