Human Generated Data

Title

Untitled (man dressed as Santa Claus on airfield talking to three men and child)

Date

c. 1945, printed later

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6750

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Interior Design 99.8
Indoors 99.8
Person 99.1
Human 99.1
Person 99.1
Person 97.6
Poster 93.7
Advertisement 93.7
Collage 93.7
Home Decor 92.7
Person 92.7
Person 90.7
Person 86.3
Person 85.3
Person 84.8
Electronics 82.1
Screen 82.1
Display 79.1
Monitor 79.1
Person 70.8
Person 69.3
Person 68.3
Person 66.3
Person 65.4
Room 64.5
Text 62.2
Leisure Activities 60.7
LCD Screen 60.2
Person 59.7
Outdoors 55.4
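
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags might be produced; it assumes boto3 is installed, AWS credentials are configured, and a hypothetical local file "untitled_annas.jpg" holds the photograph.

```python
# Minimal sketch: label/confidence tags via AWS Rekognition DetectLabels.
# Assumes boto3, configured AWS credentials, and a hypothetical local
# image file "untitled_annas.jpg".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_annas.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,  # the lowest confidence listed above is 55.4
)

# Each label carries a name and a confidence percentage; object classes
# such as "Person" may also include bounding-box instances.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```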

Clarifai
created on 2019-11-16

people 98.9
many 94.9
man 94.5
group 94.2
no person 90.4
bill 90.4
adult 88.9
vehicle 88.6
street 87.8
group together 87.5
furniture 86.7
room 85.6
television 84.4
winter 82.9
wear 81.8
one 81
stock 80.8
exhibition 80.2
woman 79.4
outdoors 79.1
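
The Clarifai tags are concept/confidence pairs of the kind returned by Clarifai's v2 predict endpoint. A rough sketch follows; the API key, model ID, and image URL are placeholders, not values used for this record.

```python
# Rough sketch: concept tags from Clarifai's v2 predict endpoint.
# The API key, model ID, and image URL below are placeholders.
import requests

CLARIFAI_KEY = "your-api-key"
MODEL_ID = "general-image-recognition"  # placeholder general model ID
image_url = "https://example.org/untitled_annas.jpg"  # hypothetical URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": image_url}}}]},
)
resp.raise_for_status()

# Concept values are fractions; multiply by 100 to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```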

Imagga
created on 2019-11-16

billboard 100
signboard 100
structure 84.7
building 24.7
sky 19.8
sign 18.8
city 16.6
architecture 16.5
wall 16.3
old 13.9
street 13.8
window 13
exterior 12.9
house 12.5
glass 11.7
door 10.7
travel 10.6
urban 10.5
empty 10.3
office 9.1
art 8.5
tree 8.5
stone 8.4
frame 8.3
landscape 8.2
home 8
business 7.9
design 7.9
antique 7.8
windows 7.7
texture 7.6
shop 7.5
outdoors 7.5
symbol 7.4
light 7.4
barbershop 7.3
detail 7.2
road 7.2
facade 7.1
flag 7.1
information 7.1
rural 7
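
The Imagga tags follow the same tag-plus-confidence pattern and correspond to Imagga's tagging endpoint. A rough sketch, assuming placeholder API credentials and a hypothetical image URL:

```python
# Rough sketch: image tagging via Imagga's REST API (/v2/tags).
# The key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "your-api-key"
IMAGGA_SECRET = "your-api-secret"
image_url = "https://example.org/untitled_annas.jpg"  # hypothetical URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Tags are returned as {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```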

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 95.8
person 91.8
black and white 88.4
billboard 77.9
art 76.4
street 71.6
cartoon 63.2
gallery 51.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-52
Gender Male, 50.3%
Angry 49.6%
Surprised 49.6%
Sad 49.5%
Happy 49.5%
Calm 49.5%
Fear 49.7%
Confused 50%
Disgusted 49.5%

AWS Rekognition

Age 20-32
Gender Female, 54.2%
Happy 48.3%
Angry 45.9%
Disgusted 45.4%
Calm 48%
Fear 45.2%
Surprised 45.3%
Confused 45.4%
Sad 46.5%

AWS Rekognition

Age 23-35
Gender Female, 50.1%
Sad 49.8%
Surprised 49.5%
Confused 49.5%
Angry 49.5%
Calm 50%
Fear 49.5%
Happy 49.5%
Disgusted 49.5%

AWS Rekognition

Age 3-11
Gender Male, 50.3%
Calm 49.5%
Fear 50.3%
Confused 49.5%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%
Sad 49.7%
Surprised 49.5%

AWS Rekognition

Age 2-8
Gender Female, 50%
Fear 50.1%
Happy 49.5%
Confused 49.5%
Calm 49.5%
Disgusted 49.5%
Sad 49.9%
Angry 49.6%
Surprised 49.5%

AWS Rekognition

Age 15-27
Gender Male, 50.3%
Happy 49.5%
Confused 49.6%
Calm 50.3%
Fear 49.5%
Angry 49.6%
Surprised 49.5%
Disgusted 49.5%
Sad 49.5%
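
Each face block above (estimated age range, gender, and per-emotion confidences) matches the shape of AWS Rekognition's DetectFaces output when all attributes are requested. A minimal sketch, assuming boto3 and the same hypothetical "untitled_annas.jpg" file as earlier:

```python
# Minimal sketch: per-face attributes via AWS Rekognition DetectFaces.
# Assumes boto3 and the hypothetical local file "untitled_annas.jpg".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_annas.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```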

Feature analysis

Amazon

Person 99.1%
Monitor 79.1%

Categories

Imagga

paintings art 99%

Captions

Microsoft
created on 2019-11-16

a sign on a window 67.8%
a sign on a window sill 66.3%
a sign in front of a window 66.2%
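
The Microsoft captions are ranked sentence candidates with confidences, as produced by Azure Computer Vision's describe operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, subscription key, and file name are placeholders.

```python
# Minimal sketch: caption candidates from Azure Computer Vision.
# The endpoint, subscription key, and file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://your-resource.cognitiveservices.azure.com/"
key = "your-subscription-key"
client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

with open("untitled_annas.jpg", "rb") as f:  # hypothetical local file
    description = client.describe_image_in_stream(f, max_candidates=3)

# Confidences are reported as fractions; multiply by 100 to match the
# percentages listed above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```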