Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4574.1-4

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 98.4
Person 98.1
Person 97.7
Person 97.3
Person 97.3
Person 97.1
Person 96.7
Person 96.5
Person 91.3
Person 85.9
Person 85.9
Person 75.2
Text 68.9
Car 68.4
Transportation 68.4
Vehicle 68.4
Automobile 68.4
Monitor 67.2
Electronics 67.2
Screen 67.2
Display 67.2
Person 65.4
Shop 60.5
Bakery 57.2

Clarifai
created on 2023-10-25

movie 99.3
people 99.1
negative 97.1
group 95
filmstrip 93.5
two 92.8
wear 91.9
group together 91.8
man 90.6
exposed 90.3
street 89.2
art 88.1
woman 87.5
adult 86.9
slide 85.9
screen 84.2
noisy 84
military 83.8
collage 83.6
room 81.4

Imagga
created on 2022-01-08

equipment 49.3
sequencer 38.5
film 37.1
negative 35.7
electronic equipment 31.8
apparatus 31.6
city 19.1
photographic paper 16.8
monitor 16
travel 14.1
architecture 14.1
urban 14
business 13.4
building 13
grunge 12.8
screen 11.9
sea 11.7
water 11.3
photographic equipment 11.2
old 11.1
art 11.1
strip 10.7
movie 10.7
retro 10.6
landscape 10.4
light 10
frame 10
vintage 9.9
night 9.8
black 9.6
entertainment 9.2
ocean 9.1
sky 8.9
digital 8.9
cinema 8.8
port 8.7
scene 8.6
construction 8.5
industry 8.5
buildings 8.5
design 8.4
people 8.4
camera 8.3
transportation 8.1
computer 8
river 8
house 7.5
case 7.4
technology 7.4
television 7.4
vacation 7.4
speed 7.3
transport 7.3
equalizer 7.3
border 7.2
landmark 7.2
amplifier 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 94
indoor 89.4
screenshot 75.9
picture frame 62.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Female, 88.7%
Calm 47%
Disgusted 27.2%
Sad 14.9%
Happy 3.4%
Surprised 3.2%
Fear 1.7%
Angry 1.5%
Confused 1%

AWS Rekognition

Age 6-16
Gender Male, 94.5%
Calm 79.2%
Confused 9.6%
Sad 5.3%
Disgusted 1.8%
Angry 1.4%
Fear 1.3%
Surprised 0.8%
Happy 0.6%

AWS Rekognition

Age 18-24
Gender Male, 95.5%
Calm 84.8%
Sad 8.7%
Angry 2.7%
Surprised 1.2%
Disgusted 1%
Confused 0.8%
Fear 0.4%
Happy 0.4%

AWS Rekognition

Age 19-27
Gender Male, 97.1%
Calm 97%
Angry 0.7%
Happy 0.7%
Sad 0.5%
Fear 0.3%
Disgusted 0.3%
Surprised 0.2%
Confused 0.2%

AWS Rekognition

Age 19-27
Gender Male, 87.2%
Calm 77.8%
Sad 8.2%
Happy 3.6%
Angry 3%
Surprised 2.8%
Confused 1.8%
Disgusted 1.5%
Fear 1.5%

AWS Rekognition

Age 18-24
Gender Male, 100%
Calm 97.7%
Sad 1.1%
Happy 0.4%
Angry 0.4%
Confused 0.3%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 12-20
Gender Male, 97.8%
Disgusted 44.9%
Surprised 20.3%
Sad 14.1%
Calm 10.9%
Angry 5.2%
Confused 2.2%
Happy 1.7%
Fear 0.8%

AWS Rekognition

Age 18-24
Gender Female, 58.1%
Happy 64.9%
Calm 17.8%
Sad 5.7%
Confused 4.5%
Angry 4%
Disgusted 1.2%
Fear 1%
Surprised 0.9%

AWS Rekognition

Age 19-27
Gender Female, 71.4%
Calm 68.6%
Sad 11.4%
Confused 11%
Angry 5.1%
Surprised 2%
Disgusted 0.8%
Fear 0.6%
Happy 0.6%

AWS Rekognition

Age 20-28
Gender Male, 97.8%
Calm 97.3%
Sad 1.4%
Happy 0.4%
Surprised 0.3%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 16-22
Gender Male, 99.4%
Calm 85.6%
Sad 7.2%
Angry 2.2%
Happy 1.3%
Fear 1.1%
Disgusted 1%
Confused 0.8%
Surprised 0.7%

Feature analysis

Amazon

Person 99.3%
Car 68.4%
Monitor 67.2%

Categories

Imagga

cars vehicles 99.8%

Text analysis

Amazon

14
13
12
SUPER
11
SUPER XX
XX
11:
EASTINAAN
-1

Google

EASTIMA N II: SUPER XX 13
EASTIMA
N
II:
SUPER
XX
13