Human Generated Data

Title

Untitled (male group portrait)

Date

1908

People

Artist: Arlington Studio, American, active 1910s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2922

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.2
Human 99.2
Person 99.1
Person 98.8
Person 97.3
Person 95.6
Person 95.1
Person 94.4
Person 91.9
Person 86.3
Poster 81.5
Advertisement 81.5
Collage 77
People 74.8
Art 74.6
Military 71.4
Military Uniform 70.5
Furniture 60.2
Text 56.3
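
The Amazon tags above are label/confidence pairs of the kind returned by image-labeling services such as Amazon Rekognition. As a minimal sketch (the `filter_tags` helper is an illustrative assumption, not part of any service API; the data literal is copied from the list above), a confidence threshold can be applied like this:

```python
# Sample of the Amazon tags listed above, as (label, confidence) pairs.
amazon_tags = [
    ("Person", 99.2), ("Human", 99.2), ("Poster", 81.5),
    ("Advertisement", 81.5), ("Collage", 77.0), ("People", 74.8),
    ("Art", 74.6), ("Military", 71.4), ("Military Uniform", 70.5),
    ("Furniture", 60.2), ("Text", 56.3),
]

def filter_tags(tags, threshold):
    """Keep only labels whose confidence is at or above the threshold."""
    return [label for label, conf in tags if conf >= threshold]

high_confidence = filter_tags(amazon_tags, 90.0)
```

At a 90% threshold only the two strongest tags ("Person" and "Human") survive, which matches how the list above is dominated by person detections.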

Clarifai
created on 2023-10-26

people 99.9
group 99.8
adult 98.5
documentary 98.4
print 97.8
child 96.6
man 96.5
art 95.5
woman 95.4
vintage 94.8
portrait 94.7
interaction 94.1
wear 93.4
scan 92.9
many 92.7
family 92.3
outfit 91.7
war 91.5
sepia 91.4
furniture 90.7

Imagga
created on 2022-01-23

brass 100
memorial 91.4
structure 67.1
old 44.6
grunge 41.7
texture 31.9
wall 31.8
antique 31.3
aged 29
vintage 28.9
ancient 28.5
dirty 23.5
stone 23.3
retro 22.9
grungy 19.9
textured 19.3
pattern 19.1
paper 17.2
architecture 17.2
brown 16.9
stained 15.4
rusty 15.2
material 15.2
history 14.3
weathered 14.2
art 14.2
empty 13.7
culture 13.7
rough 13.7
aging 13.4
surface 13.2
wallpaper 13
blank 12.9
paint 12.7
dirt 12.4
canvas 12.3
map 12.1
frame 11.7
rust 11.6
worn 11.5
damaged 11.4
decoration 11.4
backgrounds 11.4
text 11.3
sculpture 11.2
atlas 11.1
global 10.9
design 10.7
carving 10.6
geography 10.6
color 10.6
stain 10.6
parchment 10.6
building 10.4
gravestone 10.3
earth 10.2
space 10.1
world 9.9
travel 9.9
burnt 9.7
decay 9.6
north 9.6
bill 9.5
monument 9.3
page 9.3
exterior 9.2
artwork 9.2
border 9
detail 8.8
torn 8.7
obsolete 8.6
concrete 8.6
nation 8.5
poster 8.5
graffito 8.3
grain 8.3
note 8.3
country 7.9
urban 7.9
continent 7.8
messy 7.7
china 7.7
international 7.6
england 7.6
temple 7.6
textures 7.6
planet 7.5
decorative 7.5
backdrop 7.4
globe 7.4
symbol 7.4
letter 7.3

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 92.2
person 91.1
old 79
clothing 78.2
painting 63.9
posing 61.1
picture frame 46
plaque 27.5
stone 4.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 40-48
Gender Male, 98.2%
Happy 96.6%
Calm 1.2%
Confused 0.8%
Angry 0.5%
Sad 0.3%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 98.6%
Angry 0.5%
Confused 0.3%
Happy 0.2%
Sad 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 100%
Calm 98.9%
Angry 0.4%
Happy 0.3%
Confused 0.1%
Surprised 0.1%
Sad 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 99.5%
Sad 0.2%
Angry 0.1%
Happy 0.1%
Fear 0%
Confused 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 19-27
Gender Female, 99.1%
Calm 98.4%
Happy 1%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 42-50
Gender Male, 100%
Confused 94.7%
Calm 1.9%
Fear 1.6%
Sad 0.8%
Angry 0.5%
Happy 0.2%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 98.8%
Confused 0.5%
Happy 0.3%
Sad 0.2%
Fear 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 50-58
Gender Male, 100%
Calm 75.5%
Angry 8%
Confused 5.6%
Surprised 2.8%
Sad 2.7%
Fear 2.2%
Disgusted 1.9%
Happy 1.3%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Happy 90.5%
Calm 5.4%
Confused 1.5%
Surprised 0.7%
Angry 0.7%
Disgusted 0.5%
Sad 0.4%
Fear 0.3%
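
Each AWS Rekognition face record above reports an age range, a gender estimate, and a percentage for every emotion, with the percentages summing to roughly 100. A minimal sketch of reducing one record to its dominant emotion (the `dominant_emotion` helper is an illustrative assumption, not a Rekognition API call; the scores are the first face record listed above):

```python
# Emotion scores from the first AWS Rekognition face record above.
face_emotions = {
    "Happy": 96.6, "Calm": 1.2, "Confused": 0.8, "Angry": 0.5,
    "Sad": 0.3, "Disgusted": 0.2, "Fear": 0.2, "Surprised": 0.1,
}

def dominant_emotion(emotions):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

label, score = dominant_emotion(face_emotions)
```

Applied across the nine face records above, this reduction would report most faces as Calm, two as Happy, and one as Confused.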

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
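
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets rather than percentages. To compare the eleven face records above, the buckets can be mapped to an ordinal scale; the ordering below follows the Vision API's Likelihood enum (VERY_UNLIKELY through VERY_LIKELY), but the helper and variable names are illustrative assumptions:

```python
# Ordinal scale for Google Vision likelihood buckets, weakest to strongest.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def at_least(value, threshold):
    """True if `value` is at or above `threshold` on the likelihood scale."""
    return LIKELIHOOD_ORDER.index(value) >= LIKELIHOOD_ORDER.index(threshold)

# Joy values from the eleven Google Vision face records above:
# eight "Very unlikely", one "Likely", one "Very likely", one "Unlikely".
joy_values = ["Very unlikely"] * 8 + ["Likely", "Very likely", "Unlikely"]
joyful = sum(1 for v in joy_values if at_least(v, "Likely"))
```

Counting faces at "Likely" or above gives two joyful faces, broadly consistent with the Rekognition records above that scored two faces as predominantly Happy.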

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

Studio