Human Generated Data

Title

Untitled (children group)

Date

c. 1900

People

Artist: J. Brenner, American, active 1900s–1910s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2914

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.7
Apparel 99.7
Person 98.7
Human 98.7
Person 98.5
Person 98.4
Person 97.3
Footwear 82
Overcoat 67.3
Coat 67.3
Boot 66.1
People 63.1
Door 62.6
Shoe 60.1

Clarifai
created on 2023-10-26

child 98.7
people 98.4
portrait 97.5
wear 96.3
painting 96.1
son 96
retro 95.8
art 95.7
sepia 95
vintage 93.6
old 93.5
nostalgia 93.3
memory 92.7
affection 91.9
documentary 90.7
man 89.4
antique 87.7
family 86.6
two 86.5
paper 84.4

Imagga
created on 2022-01-23

military uniform 56.7
uniform 45.4
clothing 33.2
old 30.6
architecture 25.2
ancient 25.1
covering 23.7
consumer goods 22.9
vintage 21.5
antique 20.9
building 20.8
statue 20.1
sculpture 18.8
history 18.8
historic 18.3
wall 18
stone 16.9
monument 16.8
tourism 16.5
art 15.9
religion 15.2
historical 15.1
travel 14.8
door 14.5
religious 14
window 13.8
culture 13.7
structure 12.9
grunge 12.8
detail 12.1
decoration 11.5
retro 11.5
famous 11.2
commodity 11.1
city 10.8
street 10.1
house 10
aged 10
dirty 9.9
architectural 9.6
kin 9.6
texture 9
landmark 9
wooden 8.8
home 8.8
arch 8.7
photograph 8.7
entrance 8.7
memorial 8.7
temple 8.7
design 8.4
frame 8.4
destination 8.4
black 8.4
church 8.3
traditional 8.3
carving 8.2
man 8.1
heritage 7.7
construction 7.7
brass 7.6
wood 7.5
symbol 7.4
tourist 7.4
holiday 7.2

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

clothing 99.3
person 98.5
wall 96.2
toddler 95.4
human face 95
old 89.9
baby 89.1
smile 88.8
boy 88.8
text 88.1
child 87.3
indoor 85.5
black 81.6
white 77
footwear 73.9
posing 70.5
picture frame 37.1
vintage 34.3
plaque 20.8

Face analysis

AWS Rekognition

Age 2-10
Gender Male, 87.7%
Calm 93.8%
Sad 3.9%
Confused 1.3%
Angry 0.5%
Fear 0.2%
Disgusted 0.2%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 2-8
Gender Male, 75.2%
Calm 98.8%
Sad 0.8%
Angry 0.2%
Confused 0.1%
Disgusted 0%
Happy 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 6-12
Gender Female, 100%
Calm 94%
Angry 3.4%
Sad 0.7%
Fear 0.6%
Confused 0.6%
Disgusted 0.3%
Surprised 0.2%
Happy 0.1%

AWS Rekognition

Age 1-7
Gender Female, 94.6%
Calm 99.5%
Confused 0.2%
Sad 0.1%
Happy 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%
Surprised 0%

Microsoft Cognitive Services

Age 6
Gender Male

Microsoft Cognitive Services

Age 9
Gender Male

Microsoft Cognitive Services

Age 14
Gender Female

Microsoft Cognitive Services

Age 12
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Categories

Imagga

paintings art 99.4%

Text analysis

Amazon

NEW
43
ST.
A.COR.
YORK.
43 AVE NEW A.COR. YORK. 3RD ST.
3RD
AVE

Google

NEW YORK
NEW
YORK