Human Generated Data

Title

Untitled (employees posing with Santa Claus in front of tree and creche)

Date

1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3791

Machine Generated Data

Tags

Each tag below is listed with the service's confidence score, expressed as a percentage.

Amazon
created on 2019-06-01

Person 98.3
Human 98.3
Person 97.3
Plant 92.4
Tree 92.4
Person 90.6
Person 89.7
Interior Design 89.6
Indoors 89.6
Person 82.2
Room 80.7
Person 75.2
Person 74.6
Person 74
Person 72.8
Person 72.5
People 72.1
Lobby 59.4
Clinic 56.1
Person 54.4
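
The Amazon tags above are the kind of label-and-confidence pairs returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags might be produced with boto3, assuming configured AWS credentials and a local copy of the scan (the filename below is hypothetical):

# Minimal sketch: label tags via AWS Rekognition (boto3).
# Assumes AWS credentials are configured; the image filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.3791.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # the list above contains 20 tags
    MinConfidence=50.0,  # the lowest tag above is about 54%
)

# Each label carries a name and a confidence score (a percentage).
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))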

Clarifai
created on 2019-06-01

people 99.4
adult 97.3
room 96.5
group 94.9
man 94.2
indoors 93.1
woman 92.8
illustration 86.9
window 86
family 85.8
many 84.1
modern 82.3
child 81.2
furniture 79.9
art 79.1
home 78.7
group together 78.4
house 78.4
light 78
wear 76.8

Imagga
created on 2019-06-01

grunge 40.9
sketch 38.7
drawing 38.4
design 28.1
texture 25.7
art 25.3
frame 24
representation 23.1
retro 21.3
pattern 21.2
decoration 20.9
silhouette 19.9
old 19.5
wallpaper 19.2
vintage 19
graphic 18.2
dirty 18.1
style 17.8
modern 16.8
paint 16.3
shape 15.7
aged 15.4
creative 15
backdrop 14.8
flower 14.6
floral 14.5
artistic 13.9
color 13.4
splash 13.2
border 12.7
leaf 12.5
space 12.4
element 12.4
black 12
template 11.9
paper 11.7
ink 11.6
grungy 11.4
banner 11
decorative 10.9
holiday 10.8
light 10.7
funky 10.6
stain 10.6
backgrounds 10.5
digital 10.5
textured 10.5
poster 10.4
graphics 10.3
map 10.2
swirl 10.1
global 10
antique 9.9
splat 9.8
card 9.8
splatter 9.8
scroll 9.5
curl 9.5
blank 9.4
line 9.4
symbol 9.4
halftone 9.4
elements 9.3
painting 9.2
sign 9
plant 9
cool 8.9
urban 8.7
text 8.7
spring 8.6
season 8.6
motion 8.6
berry 8.3
artwork 8.2
transparent 8.1
celebration 8
architecture 7.8
ornament 7.8
summer 7.7
construction 7.7
dirt 7.6
elegance 7.6
power 7.6
clean 7.5
simple 7.5
page 7.4
globe 7.4
gymnasium 7.4
window 7.4
smooth 7.3
business 7.3
copy 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

window 92.6
christmas tree 82.6
black and white 76.6
old 74.3
white 62.3
group 59.7
posing 50.5

Face analysis

Each detected face is listed with an estimated age range and per-attribute confidence scores, expressed as percentages.

Amazon

AWS Rekognition

Age 26-43
Gender Female, 52.2%
Disgusted 45.1%
Sad 46.9%
Happy 45.2%
Confused 45.2%
Surprised 45.4%
Angry 45.2%
Calm 51.9%

AWS Rekognition

Age 16-27
Gender Female, 54.9%
Sad 47.1%
Happy 45.8%
Surprised 45.6%
Calm 49.7%
Disgusted 45.3%
Confused 46.1%
Angry 45.5%

AWS Rekognition

Age 38-59
Gender Female, 50.2%
Happy 45.3%
Surprised 45.5%
Angry 45.4%
Confused 45.7%
Calm 47.4%
Sad 47.4%
Disgusted 48.2%

AWS Rekognition

Age 35-52
Gender Male, 52.4%
Sad 51.3%
Calm 46.2%
Surprised 45.3%
Angry 45.3%
Disgusted 45.3%
Happy 46.4%
Confused 45.2%

AWS Rekognition

Age 48-68
Gender Female, 52%
Sad 47.7%
Calm 47%
Surprised 45.9%
Angry 46.9%
Disgusted 45.7%
Happy 45.3%
Confused 46.6%

AWS Rekognition

Age 17-27
Gender Male, 54.6%
Disgusted 45.3%
Calm 51.1%
Confused 45.5%
Sad 46.9%
Surprised 45.4%
Angry 45.4%
Happy 45.5%

AWS Rekognition

Age 26-43
Gender Female, 51.6%
Confused 46.3%
Surprised 45.7%
Angry 45.8%
Calm 49%
Disgusted 45.3%
Sad 47.4%
Happy 45.4%

AWS Rekognition

Age 35-52
Gender Male, 53.2%
Disgusted 45.1%
Sad 47.1%
Happy 45.3%
Surprised 45.1%
Calm 51.8%
Angry 45.2%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Male, 51.7%
Confused 45.1%
Happy 45.5%
Surprised 45.3%
Angry 45.2%
Disgusted 45.4%
Calm 52.9%
Sad 45.7%
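
The per-face estimates above (age range, gender, and emotion scores) follow the shape of AWS Rekognition's DetectFaces response. A minimal sketch, reusing the hypothetical rekognition client and image_bytes from the label sketch above:

# Minimal sketch: per-face attributes via AWS Rekognition (boto3).
faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]     # e.g. {"Low": 26, "High": 43}
    gender = face["Gender"]    # e.g. {"Value": "Female", "Confidence": 52.2}
    print("Age", age["Low"], "-", age["High"])
    print("Gender", gender["Value"], round(gender["Confidence"], 1), "%")
    for emotion in face["Emotions"]:  # one confidence score per emotion type
        print(emotion["Type"].title(), round(emotion["Confidence"], 1), "%")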

Feature analysis

Amazon

Person 98.3%

Categories

Imagga

interior objects 99.9%