Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4615

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Interior Design 99.9
Indoors 99.9
Electronics 99.6
Display 99.6
Screen 99.6
Person 99.5
Human 99.5
Monitor 98.7
LCD Screen 97.3
Person 97.3
Person 96.5
Person 93.7
Television 87.2
TV 87.2
Person 84.3
Person 72.9
Computer 70.9
Room 59.2
Theater 57.5
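
The Amazon tags above are the kind of output returned by AWS Rekognition's label-detection operation, where each label carries a confidence score on a 0-100 scale. A minimal sketch of such a call, assuming boto3 credentials are already configured and using a hypothetical local copy of the image:

```python
import boto3

IMAGE_PATH = "contact_sheet.jpg"  # hypothetical local copy of the photograph

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence guesses
    )

# Each label pairs a name with a 0-100 confidence, like the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```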

Clarifai
created on 2023-10-25

screen 99.5
people 99.4
television 98.2
technology 98.1
computer 97.3
display 96.2
monitor 95.6
man 95.6
movie 95.5
portrait 94.6
art 94.4
window 93.7
collage 92.4
monochrome 92.3
one 91.7
indoors 91.6
analogue 90.9
adult 90
video 89.9
baby 87.9

Imagga
created on 2022-01-08

television 100
monitor 78.1
broadcasting 59.1
telecommunication system 56.8
telecommunication 44.5
computer 38.5
screen 37.8
equipment 37.1
electronic equipment 36.3
technology 33.4
medium 29.3
business 29.1
office 27.3
display 26
laptop 23.7
flat 19.3
electronic 18.7
communication 18.5
modern 16.1
hand 15.9
keyboard 15
work 14.9
people 14.5
man 14.1
person 13.6
information 13.3
working 13.3
businessman 13.2
desk 13.2
notebook 13.1
digital 13
video 12.6
desktop 12.5
design 11.8
black 11.4
home 11.2
object 11
adult 11
finance 11
room 10.9
plasma 10.7
media 10.5
one 10.4
tech 10.4
sitting 10.3
close 10.3
back 10.1
global 10
frame 10
male 9.9
liquid crystal 9.9
financial 9.8
looking 9.6
corporate 9.4
professional 9.3
data 9.1
interior 8.8
indoors 8.8
panel 8.7
blank 8.6
space 8.5
web 8.5
horizontal 8.4
studio 8.4
network 8.3
entertainment 8.3
single 8.2
film 7.9
smile 7.8
portable 7.8
movie 7.8
3d 7.7
chart 7.6
wireless 7.6
workplace 7.6
electronics 7.6
contemporary 7.5
happy 7.5
success 7.2
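
The Imagga tags are the sort of result produced by Imagga's tagging REST endpoint, which returns English tag names with confidence scores. A rough sketch using the requests library, with placeholder API credentials and a placeholder image URL; the response shape shown here should be verified against Imagga's current documentation:

```python
import requests

API_KEY = "your_api_key"        # placeholder credential
API_SECRET = "your_api_secret"  # placeholder credential
IMAGE_URL = "https://example.org/contact_sheet.jpg"  # placeholder URL

# Imagga's v2 tagging endpoint uses HTTP Basic auth with key/secret.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Tags are returned as {"tag": {"en": ...}, "confidence": ...} entries.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```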

Microsoft
created on 2022-01-08

monitor 99.7
screenshot 98
text 96.7
television 94.7
indoor 88.3
screen 77.2
computer 70.5
microwave 36.2
flat 33.3
set 31.7
display 28.5
kitchen appliance 13.3
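
The Microsoft tags correspond to the image-tagging operation of Azure's Computer Vision service. A sketch using the azure-cognitiveservices-vision-computervision SDK, with a placeholder endpoint, key, and image URL; the service reports confidence on a 0-1 scale, so it is scaled here to match the percentages above:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"                                                   # placeholder
IMAGE_URL = "https://example.org/contact_sheet.jpg"                # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns ImageTag objects, each with a name and a 0-1 confidence.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```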

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 92.4%
Calm 64.2%
Sad 19%
Confused 6.7%
Surprised 4.6%
Disgusted 2.3%
Angry 1.7%
Happy 0.9%
Fear 0.5%

AWS Rekognition

Age 23-33
Gender Male, 58.2%
Calm 92.9%
Sad 2.9%
Happy 2%
Fear 1%
Angry 0.4%
Confused 0.4%
Disgusted 0.3%
Surprised 0.2%
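
The two AWS Rekognition face records above (an age range, a gender estimate, and a ranked list of emotions) match the structure Rekognition returns when all face attributes are requested. A minimal sketch, again assuming configured boto3 credentials and a hypothetical local image file:

```python
import boto3

client = boto3.client("rekognition")

with open("contact_sheet.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```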

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
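
The Google Vision entries report per-face likelihoods (Very unlikely through Very likely) for surprise, anger, sorrow, joy, headwear, and blur, which is how the Cloud Vision API expresses face attributes. A sketch using the google-cloud-vision client, assuming application default credentials and a hypothetical image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("contact_sheet.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY or LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```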

Feature analysis

Amazon

Person 99.5%
Monitor 98.7%

Categories

Imagga

text visuals 60.6%
paintings art 22.6%
interior objects 11.9%
food drinks 2.8%

Text analysis

Amazon

SUPER
23
SUPER XX
XX
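
The Amazon text results are consistent with Rekognition's text-detection output, which returns both whole lines and the individual words within them, each with a confidence score. A short sketch under the same assumptions as the earlier Rekognition examples:

```python
import boto3

client = boto3.client("rekognition")

with open("contact_sheet.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is either a LINE or a WORD belonging to a line.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```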

Google

23 SUPER XX
23
SUPER
XX
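
The Google results follow the Cloud Vision text-detection convention: the first annotation is the full detected string and the remaining annotations are its individual tokens. A sketch under the same assumptions as the face-detection example above:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("contact_sheet.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] holds the full text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```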