Human Generated Data

Title

Untitled (children and women outside house)

Date

1933-1934, printed later

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6790

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Poster 99.8
Collage 99.8
Advertisement 99.8
Mammal 94.9
Animal 94.9
Horse 94.9
Horse 93.6
Horse 88
Person 83.2
Human 83.2
Person 82
Horse 77.6
Person 74.9
Horse 71.7
Screen 70.5
Electronics 70.5
Person 70
Military 67.9
Person 64.9
Display 64.4
Monitor 64.4
Person 61.2
Person 60.7
People 59.5
Military Uniform 58.2
Brick 55.2
Person 47.5

Clarifai
created on 2019-11-16

monochrome 99.6
people 98.1
street 97.8
city 96.1
silhouette 95.2
light 94.4
man 93.5
dark 92.6
family 91.8
architecture 90.9
reflection 90.7
collage 90.5
cinematic 90.2
group 89.1
black and white 88.4
square 88.3
portfolio 87.9
analogue 87.6
winter 87.1
no person 86.8

Imagga
created on 2019-11-16

monitor 84.5
electronic equipment 60.2
television 48
equipment 38.8
broadcasting 30.1
telecommunication 22.7
grunge 20.4
screen 19.7
old 18.1
collage 17.3
black 16.8
film 16.2
texture 16
frame 15.8
sky 15.3
vintage 14.9
structure 14.6
paint 14.5
dirty 14.5
medium 14.4
art 14.3
computer 14.2
city 14.1
text 14
space 14
pattern 13.7
rough 13.7
border 13.6
your 13.5
design 13.5
architecture 13.3
graphic 13.1
digital 13
light 12.7
travel 12.7
billboard 12.6
movie 12.6
dark 12.5
night 12.4
retro 12.3
negative 11.9
landscape 11.9
building 11.9
scratches 11.8
frames 11.7
noise 11.7
liquid crystal display 11.6
rust 11.6
damaged 11.4
grungy 11.4
antique 11.2
noisy 10.8
designed 10.8
photographic 10.8
slide 10.7
scratch 10.7
material 10.7
strip 10.7
edge 10.6
urban 10.5
signboard 10.2
decoration 10.1
silhouette 9.9
layered 9.8
mess 9.8
office 9.7
window 9.7
messy 9.7
layer 9.7
mask 9.6
weathered 9.5
entertainment 9.2
street 9.2
telecommunication system 9
overlay 8.9
highly 8.9
detailed 8.7
dirt 8.6
evening 8.4
background 8.4
copy space 8.1
sunset 8.1
water 8
projects 7.9
high 7.8
glass 7.8
empty 7.7
modern 7.7
great 7.7
camera 7.4
nice 7.3
business 7.3
color 7.2
scenery 7.2
horizon 7.2
river 7.1
trees 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 96.9
black and white 72.8
white 71.1
horse 55.7

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Male, 50.5%
Happy 49.5%
Angry 49.9%
Disgusted 49.5%
Sad 49.6%
Surprised 49.5%
Calm 49.6%
Fear 49.8%
Confused 49.5%

Feature analysis

Amazon

Horse 94.9%
Person 83.2%

Text analysis

Amazon

TEAM
DOG
TAVERN
MEALS
MEALS DOG TEAM TAVERN ROOMS
INDUTRIES
GRENFELL
GRENFELL ADOR INDUTRIES
ADOR
ROOMS

Google

MEALS DOG TEAM TAVERN ROOMS GRENFELL LABRADOR INDUSTRIES EL
MEALS
DOG
TEAM
TAVERN
ROOMS
GRENFELL
LABRADOR
INDUSTRIES
EL