Human Generated Data

Title

Untitled (contact sheet)

Date

20th century

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4608.1-3

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Interior Design 100
Indoors 100
Monitor 99.7
Display 99.7
Screen 99.7
Electronics 99.7
Person 95.6
Human 95.6
Room 94.5
LCD Screen 91.8
Person 91.2
Theater 89.5
Cinema 82.9
Person 82.6
Person 76.9
Phone 68.3
TV 64.6
Television 64.6
Person 64
Mobile Phone 56
Cell Phone 56
Person 53.6

Clarifai
created on 2023-10-25

people 98.3
window 97.9
street 97.7
landscape 95.2
light 95
art 94.6
snow 92.7
rain 91.9
man 91.6
city 91
nature 90.8
winter 90.8
smoke 89.6
movie 87.8
abstraction 87.6
weather 87.3
no person 86.6
analogue 86.5
architecture 86.5
screen 85.2

Imagga
created on 2022-01-08

aquarium 66.5
television 56.7
broadcasting 37
monitor 36.6
telecommunication 29
electronic equipment 23.3
equipment 22.3
medium 17.4
architecture 16.4
interior 15
technology 14.8
screen 14.7
modern 14
business 14
design 12.4
office 12.1
digital 12.1
black 12
home 12
building 11.8
case 11.7
display 11.3
electronic 11.2
film 10.9
light 10
hand 9.9
3d 9.3
furniture 9.2
house 9.2
city 9.1
night 8.9
urban 8.7
glass 8.5
people 8.4
entertainment 8.3
structure 8.1
computer 8.1
space 7.7
finance 7.6
tech 7.6
communication 7.5
elegance 7.5
reflection 7.3
room 7.3
art 7.1
work 7.1
sky 7

Microsoft
created on 2022-01-08

screenshot 95.5
text 95
television 88.6
computer 78.9
billboard 75.4
picture frame 59.6
display 54.2

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 91.9%
Calm 86.1%
Happy 7.8%
Angry 2%
Sad 1.6%
Confused 0.8%
Surprised 0.6%
Disgusted 0.6%
Fear 0.3%

Feature analysis

Amazon

Monitor 99.7%
Person 95.6%

Categories

Imagga

interior objects 97.1%
text visuals 1.9%

Text analysis

Amazon

10
9
6
11

Google

9 10
9
10