Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4625

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.8
Human 98.8
Person 98.5
Person 98.5
Person 94
Person 87.6
Military Uniform 87.4
Military 87.4
Person 86.3
Nature 81.3
Outdoors 75.2
People 71.8
Person 71.6
Person 67.3
Brick 60.2
Officer 59.7
Ice 58.3
Army 58.2
Armored 58.2
Snow 57.7
Sailor Suit 57.7

Clarifai
created on 2023-10-25

movie 99.9
negative 99.9
filmstrip 99.7
slide 99.3
collage 99.3
noisy 98.8
cinematography 98.5
people 98.4
exposed 98.4
photograph 97.8
screen 95.1
sliding 91.9
art 91.8
retro 91.6
old 91.3
emulsion 91.2
vintage 90.8
margin 90.7
video 89.6
group 89.1

Imagga
created on 2022-01-08

case 71.9
film 37.8
negative 32.7
equipment 21.8
business 17
digital 16.2
architecture 15.6
computer 15.3
old 14.6
photographic paper 14.4
art 14.3
building 13.9
black 13.8
sequencer 13.7
technology 13.3
monitor 13
finance 12.7
strip 12.6
movie 12.6
vintage 12.4
retro 12.3
grunge 11.9
car 11.6
screen 11.5
apparatus 10.9
city 10.8
night 10.6
collage 10.6
camera 10.2
freight car 10.1
design 10.1
electronic equipment 10
cinema 9.9
urban 9.6
photographic equipment 9.6
dollar 9.3
frame 9.1
information 8.8
photographic 8.8
light 8.7
close 8.6
money 8.5
texture 8.3
entertainment 8.3
data 8.2
global 8.2
border 8.1
object 8.1
man 8.1
financial 8
filmstrip 7.9
text 7.9
slide 7.8
structure 7.8
industry 7.7
photograph 7.7
bill 7.6
hand 7.6
roll 7.6
lights 7.4
closeup 7.4
symbol 7.4
cash 7.3
paint 7.2
currency 7.2
bank 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 96.7
screenshot 87.8
picture frame 68.2

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 48.7%
Happy 47.3%
Confused 1.2%
Surprised 1%
Sad 0.5%
Fear 0.5%
Angry 0.4%
Disgusted 0.4%

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Happy 92.9%
Calm 2.6%
Surprised 1.1%
Angry 1%
Disgusted 1%
Sad 0.6%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 86%
Angry 3.9%
Disgusted 3.9%
Sad 2.7%
Confused 1.5%
Surprised 0.8%
Happy 0.7%
Fear 0.6%

AWS Rekognition

Age 52-60
Gender Male, 99.7%
Calm 37.3%
Sad 33.7%
Happy 27.1%
Angry 0.6%
Confused 0.4%
Fear 0.3%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 60-70
Gender Male, 99.9%
Calm 96.9%
Sad 1.8%
Angry 0.4%
Confused 0.2%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 33-41
Gender Male, 99.6%
Calm 64.1%
Sad 33%
Confused 2%
Angry 0.3%
Disgusted 0.2%
Happy 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Male, 99.4%
Calm 97.8%
Sad 0.9%
Angry 0.5%
Surprised 0.3%
Disgusted 0.2%
Happy 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 97.2%
Calm 97.7%
Sad 1.7%
Confused 0.2%
Happy 0.2%
Disgusted 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

Microsoft Cognitive Services

Age 55
Gender Male

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Text analysis

Amazon

20

Google

20 II: 20
20
II: