Human Generated Data

Title

Untitled (boy and girl in garden pointing at each other)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13827

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Poster 100
Advertisement 100
Collage 100
Human 99.6
Person 99.6
Person 99.6
Person 99.6
Person 99.3
Vehicle 91.3
Boat 91.3
Transportation 91.3
Electronics 71.4
Screen 71.4
Display 61.1
Monitor 57.2
Watercraft 56.9
Vessel 56.9
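
Label/score pairs like the Amazon list above (each tag with a confidence value on a 0-100 scale) are the form of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3 follows; the file name and region are placeholders, not part of this record.

# Minimal sketch: image labels via Amazon Rekognition DetectLabels (boto3).
# The file path and region are placeholders for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_boy_and_girl_in_garden.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,       # cap the number of returned labels
    MinConfidence=50,   # drop labels below 50% confidence
)

# Each label carries a name and a 0-100 confidence score, matching the
# "tag score" pairs listed above (e.g. Person 99.6).
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")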

Clarifai
created on 2019-11-16

people 98.9
monochrome 98.8
group 96.3
nature 96.2
city 95.9
landscape 95.2
collage 94.9
winter 94.8
man 94.4
water 94.1
television 93.9
snow 93.1
window 91.6
architecture 91.4
exhibition 91.1
art 90.6
adult 90.5
river 90.2
portrait 89.7
street 88.9
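
Concept lists like the Clarifai block above can be requested from Clarifai's v2 predict endpoint. The sketch below is a rough illustration only: the endpoint path, model identifier, API key, and image URL are assumptions and should be checked against Clarifai's current documentation.

# Rough sketch: concept tags from Clarifai's v2 predict endpoint.
# Endpoint, model id, key, and image URL are assumed placeholders.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                              # placeholder
MODEL_ID = "general-image-recognition"                         # assumed general model id
IMAGE_URL = "https://example.org/untitled_boy_and_girl.jpg"    # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Clarifai reports concept values on a 0-1 scale; scaling by 100 gives
# numbers comparable to the scores above (e.g. people 98.9).
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")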

Imagga
created on 2019-11-16

windowsill 35.2
screen 35
grunge 32.3
sill 28.1
vintage 27.3
window screen 26.9
old 24.4
texture 22.2
structural member 21.2
retro 20.5
antique 19.9
frame 19.3
pattern 19.1
dirty 19
protective covering 18.4
film 17.9
black 17.4
paint 17.2
damaged 17.2
grungy 17.1
art 17
graphic 16
text 15.7
rough 15.5
rust 15.4
border 15.4
support 15.3
material 15.2
structure 15.2
covering 15
space 14.7
design 14.6
aged 14.5
weathered 14.2
aquarium 13.7
textured 13.1
negative 12.8
digital 12.2
wall 12
window 11.7
messy 11.6
collage 11.6
billboard 11.1
decoration 11.1
silhouette 10.8
scratch 10.7
dirt 10.5
ancient 10.4
paper 10.2
people 10
color 10
noisy 9.9
computer 9.9
photographic 9.8
noise 9.8
slide 9.8
edge 9.6
man 9.4
device 9.1
signboard 9
office 8.9
designed 8.9
frames 8.8
urban 8.7
movie 8.7
your 8.7
male 8.5
dark 8.3
sky 8.3
monitor 8.3
building 8.1
water 8
overlay 7.9
scratches 7.9
layered 7.9
mess 7.9
architecture 7.8
strip 7.8
layer 7.7
modern 7.7
mask 7.7
world 7.6
rusty 7.6
poster 7.6
backdrop 7.4
camera 7.4
interior 7.1
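
Tag sets like the Imagga list above (confidence on a 0-100 scale) are the kind of result Imagga's v2 tagging endpoint returns. The sketch below assumes that REST endpoint with HTTP Basic authentication; the credentials and image URL are placeholders.

# Rough sketch: tags from Imagga's v2 tagging endpoint.
# Credentials and image URL are placeholders for illustration only.
import requests

IMAGGA_KEY = "YOUR_API_KEY"                                    # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"                              # placeholder
IMAGE_URL = "https://example.org/untitled_boy_and_girl.jpg"    # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),   # HTTP Basic auth with key/secret
)
response.raise_for_status()

# Imagga reports confidence on a 0-100 scale, matching the list above
# (e.g. windowsill 35.2).
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")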

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 94.9
black and white 93.2
text 93.2
indoor 90.8
gallery 89.4
clothing 80.3
monochrome 68.6
man 66.2
water 65.6
room 56.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Female, 52.4%
Disgusted 45%
Happy 53.2%
Confused 45%
Angry 45.5%
Fear 45.1%
Sad 45.3%
Calm 45.7%
Surprised 45.1%

AWS Rekognition

Age 12-22
Gender Male, 50.3%
Disgusted 49.5%
Calm 50.4%
Fear 49.5%
Happy 49.5%
Confused 49.5%
Sad 49.5%
Surprised 49.5%
Angry 49.5%
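
Per-face age ranges, gender estimates, and emotion scores like the two AWS Rekognition blocks above are what the DetectFaces operation reports for each detected face. A minimal sketch with boto3 follows; the file name and region are placeholders.

# Minimal sketch: per-face age range, gender, and emotions with
# Amazon Rekognition DetectFaces (boto3). Path and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_boy_and_girl_in_garden.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],   # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]     # e.g. {"Low": 30, "High": 46}
    gender = face["Gender"]    # e.g. {"Value": "Female", "Confidence": 52.4}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:   # one confidence score per emotion type
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")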

Feature analysis

Amazon

Person 99.6%
Boat 91.3%
Monitor 57.2%

Categories

Captions

Microsoft
created on 2019-11-16

a screen shot of a television 49.3%
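
The Microsoft tag list earlier and the caption above (with its 49.3% confidence) resemble the output of Azure Computer Vision's Analyze Image call with the Tags and Description features. The sketch below assumes the v2.0 REST endpoint; the endpoint host, subscription key, and image URL are placeholders.

# Rough sketch: tags and a caption from Azure Computer Vision's Analyze Image
# REST call. Endpoint host, API version, key, and image URL are assumptions.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"   # placeholder
SUBSCRIPTION_KEY = "YOUR_KEY"                                  # placeholder
IMAGE_URL = "https://example.org/untitled_boy_and_girl.jpg"    # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()
analysis = response.json()

# Azure reports confidence on a 0-1 scale; scaling by 100 matches the
# numbers above (person 94.9; caption 49.3%).
for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
for caption in analysis["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")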