Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4578

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.4
Person 99.4
Person 99.3
Person 99.1
Person 99
Person 98.4
Person 95.7
Person 93.8
Person 92.1
Outdoors 91.8
Person 90.8
Nature 86
Person 83.1
Snow 78.7
Screen 74.7
Electronics 74.7
Person 74
LCD Screen 72.5
Monitor 72.5
Display 72.5
Person 68
Ice 66.8
Text 62
People 59.6
Leisure Activities 59.5
Poster 58.4
Advertisement 58.4
Person 52.8
Person 52.1
Person 42.6

Clarifai
created on 2023-10-25

collage 99
movie 98.3
people 97.9
slide 97.8
negative 96.5
street 96.1
art 95.3
screen 95.1
noisy 93.9
vintage 92.9
desktop 92.2
photograph 92.1
snow 91.9
old 91.8
winter 91.7
man 91
retro 90.4
filmstrip 89.6
analogue 89.4
urban 88.7

Imagga
created on 2022-01-08

case 52.4
monitor 41.4
television 33.3
electronic equipment 30.2
aquarium 26.8
equipment 21.4
architecture 20.3
broadcasting 19.7
screen 19.5
building 17.2
old 16.7
art 15.6
window 15.4
glass 14.8
black 14.4
telecommunication 14.2
frame 14.1
vintage 14
digital 13.8
collage 13.5
film 13.4
modern 13.3
retro 13.1
design 12.9
business 12.7
city 12.5
urban 12.2
windows 11.5
home 11.2
technology 11.1
grunge 11.1
house 10.9
sliding door 10.7
your 10.6
office 10.6
structure 10.4
texture 10.4
computer 10.4
antique 10.4
door 10.2
paint 9.9
interior 9.7
movie 9.7
light 9.3
water 9.3
space 9.3
negative 9.3
travel 9.1
global 9.1
dirty 9
photographic 8.8
frames 8.8
noise 8.8
medium 8.8
slide 8.8
strip 8.7
damaged 8.6
tech 8.5
hand 8.3
transport 8.2
pattern 8.2
rough 8.2
border 8.1
reflection 8.1
close 8
noisy 7.9
scratches 7.9
scratch 7.8
layer 7.7
telecommunication system 7.7
rust 7.7
edge 7.7
outdoor 7.6
grungy 7.6
camera 7.4
street 7.4
room 7.3
people 7.2
decoration 7.2
sky 7

Microsoft
created on 2022-01-08

screenshot 98.4
text 96.5
billboard 94.1

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.3%
Sad 98.4%
Calm 1.2%
Fear 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0%
Surprised 0%

AWS Rekognition

Age 18-26
Gender Male, 98.2%
Calm 91.7%
Sad 2.6%
Disgusted 1.5%
Angry 1.4%
Fear 0.9%
Happy 0.8%
Confused 0.7%
Surprised 0.4%

AWS Rekognition

Age 6-16
Gender Male, 99.7%
Sad 42.2%
Angry 23.1%
Calm 21.8%
Happy 8.1%
Fear 2%
Surprised 1.2%
Disgusted 0.9%
Confused 0.6%

AWS Rekognition

Age 20-28
Gender Male, 92.3%
Calm 90.3%
Sad 4.2%
Angry 2.8%
Happy 1.1%
Disgusted 0.6%
Fear 0.5%
Surprised 0.3%
Confused 0.3%

AWS Rekognition

Age 21-29
Gender Male, 96.1%
Calm 84.9%
Sad 13.4%
Disgusted 0.4%
Confused 0.4%
Surprised 0.3%
Angry 0.2%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Female, 51.4%
Calm 92.2%
Sad 4.6%
Angry 1.1%
Happy 0.7%
Surprised 0.5%
Disgusted 0.5%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 25-35
Gender Female, 84.2%
Confused 28.7%
Calm 25.4%
Sad 19.5%
Happy 11.3%
Surprised 6.4%
Angry 3.4%
Disgusted 3.1%
Fear 2.3%

AWS Rekognition

Age 21-29
Gender Male, 98%
Calm 95.8%
Happy 2.2%
Fear 0.9%
Disgusted 0.5%
Sad 0.3%
Angry 0.1%
Surprised 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Poster 58.4%

Captions

Microsoft
created on 2022-01-08

graphical user interface 69.7%

Text analysis

Amazon

24
EASTMAN
DE
NO

Google

NG 24 24 EASTMAN
24
EASTMAN
NG