Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4620

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Person 99.1
Person 98.8
Person 96.5
Person 94.7
Monitor 89.4
Electronics 89.4
Screen 89.4
Display 89.4
Person 87.9
Clothing 79.9
Apparel 79.9
Coat 55.5

Clarifai
created on 2023-10-25

movie 99.8
negative 99.6
filmstrip 99.2
people 98.9
slide 98.9
exposed 97
woman 96.4
cinematography 96.2
noisy 96.1
collage 95.8
screen 95.3
analogue 95.3
photograph 94.2
art 93.6
adult 93.3
window 92.7
portrait 92.6
wear 92.5
monochrome 91.4
video 89.7

Imagga
created on 2022-01-08

negative 82.6
film 73.1
photographic paper 46.8
photographic equipment 31.2
business 20
architecture 18
urban 16.6
building 16.3
city 15.8
people 15.6
art 13.8
silhouette 13.2
man 12.8
black 12.6
design 12.4
equipment 11.9
male 11.3
office 11.2
vintage 10.7
retro 10.6
web site 10.5
group 10.5
old 10.4
grunge 10.2
symbol 10.1
strip 9.7
flag 9.4
light 9.4
occupation 9.2
locker 9
electronic equipment 8.6
crowd 8.6
monitor 8.4
house 8.4
frame 8.3
camera 8.3
data 8.2
information 8
businessman 7.9
screen 7.9
movie 7.7
travel 7.7
texture 7.6
technology 7.4
structure 7.4
entertainment 7.4
letter 7.3
team 7.2
tower 7.2
fastener 7.2
interior 7.1
cinema 7
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 96.5
screenshot 87.6
wedding dress 59.7
billboard 53.7

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 89.4%
Angry 3.6%
Sad 2.2%
Confused 1.6%
Happy 1%
Fear 0.9%
Disgusted 0.8%
Surprised 0.5%

AWS Rekognition

Age 0-6
Gender Male, 92.7%
Calm 77.2%
Disgusted 9.7%
Sad 7.3%
Angry 2.1%
Surprised 1%
Happy 1%
Confused 0.8%
Fear 0.8%

AWS Rekognition

Age 26-36
Gender Male, 81.9%
Calm 99.3%
Happy 0.6%
Confused 0%
Disgusted 0%
Surprised 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 56-64
Gender Female, 89.2%
Sad 53.6%
Calm 18.9%
Happy 8.4%
Confused 6.6%
Fear 3.5%
Surprised 3.1%
Angry 3%
Disgusted 2.9%

AWS Rekognition

Age 21-29
Gender Female, 50.4%
Sad 46%
Calm 40.4%
Confused 10%
Angry 1.5%
Disgusted 1%
Fear 0.6%
Surprised 0.4%
Happy 0.2%

AWS Rekognition

Age 42-50
Gender Female, 52.4%
Calm 92.8%
Sad 5.1%
Surprised 0.8%
Angry 0.4%
Disgusted 0.3%
Happy 0.3%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 99.6%
Calm 69.1%
Disgusted 25.2%
Happy 1.4%
Confused 1%
Sad 0.9%
Surprised 0.9%
Fear 0.8%
Angry 0.8%

AWS Rekognition

Age 2-8
Gender Female, 88.6%
Sad 72.9%
Calm 20.4%
Angry 3.9%
Disgusted 0.9%
Confused 0.6%
Fear 0.6%
Happy 0.4%
Surprised 0.4%

Microsoft Cognitive Services

Age 52
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.8%
Monitor 89.4%

Text analysis

Amazon

11