Human Generated Data

Title

Social Settlements: United States. Pennsylvania. Philadelphia. The Starr Centre Association: The Starr Centre Association, Philadelphia, Pa.: A fee of ten cents is charged those able to pay.

Date

c. 1907

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.359.2

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Person 99
Human 99
Art 83.4
Painting 83.4
Person 71.3
Person 66.1
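
The label/confidence pairs above match the shape of Amazon Rekognition's label detection output. A minimal sketch of such a request with boto3, assuming AWS credentials are already configured and using a hypothetical local file name for the photograph:

import boto3

# Assumes AWS credentials and region are configured in the environment.
client = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("starr_centre_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)

# Each label carries a name and a confidence percentage,
# e.g. "Person 99", "Painting 83.4" as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")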

Clarifai
created on 2019-06-05

people 100
group 99.7
adult 99.3
group together 98.8
furniture 98.3
two 98
man 97.4
administration 97.3
one 96.8
room 95.9
four 95.9
leader 95.8
many 95.6
three 95.5
several 95.3
seat 94
military 93.8
war 93.2
sit 93
wear 91.2
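
The Clarifai concepts above are the kind of name/score pairs returned by Clarifai's predict endpoint. A rough sketch against what I understand to be the v2 REST API; the key, model id, image URL, and exact payload shape are all placeholders or assumptions:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"      # hypothetical model id
IMAGE_URL = "https://example.org/starr_centre_photo.jpg"  # hypothetical URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back as name/value pairs on a 0-1 scale;
# the listing above shows them as percentages (e.g. "people 100").
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value']:.3f}")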

Imagga
created on 2019-06-05

crutch 26.2
painter 24.7
old 20.9
staff 20.2
man 20.1
weapon 17.3
ancient 17.3
stick 16.2
device 15.4
stone 14.7
person 14.6
history 14.3
instrument 13.7
wall 13.7
male 13.5
architecture 13.3
historical 13.2
metal 12.9
electric chair 12.8
culture 12
building 11.9
instrument of execution 11.7
mask 11.4
grunge 11.1
sword 10.9
industrial 10.9
sculpture 10.7
people 10.6
war 10.6
art 10.5
portrait 10.3
men 10.3
historic 10.1
dark 10
protection 10
travel 9.9
adult 9.2
black 9
military 8.7
light 8.7
work 8.6
statue 8.6
industry 8.5
traditional 8.3
inside 8.3
dirty 8.1
worker 8
soldier 7.8
face 7.8
antique 7.8
house 7.5
smoke 7.4
holding 7.4
tourism 7.4
body 7.2
steel 7.1
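
The Imagga tags above pair a confidence percentage with each keyword. A rough sketch against Imagga's v2 tagging endpoint, which uses HTTP Basic auth with a key/secret pair; the credentials and image URL are placeholders, and the endpoint and response shape are assumptions:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/starr_centre_photo.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs a confidence percentage with a tag, e.g. "crutch 26.2".
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")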

Google
created on 2019-06-05

Microsoft
created on 2019-06-05

person 92.6
clothing 92
old 78.8
photograph 56.9
man 53.2
vintage 27.1
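
The Microsoft tags above follow the shape of the Azure Computer Vision analyze call, which returns tag names with 0-1 confidences (shown above as percentages). A rough REST sketch with a placeholder endpoint, key, and image URL; the API version and response shape are assumptions:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_COMPUTER_VISION_KEY"                                # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/starr_centre_photo.jpg"},  # hypothetical URL
)
response.raise_for_status()

# Tags are name/confidence pairs; confidence is 0-1, shown above as percentages.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")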

Color Analysis

Face analysis

AWS Rekognition

Age 49-69
Gender Male, 98.9%
Surprised 1.2%
Sad 65.2%
Happy 2.9%
Confused 4.3%
Calm 23.1%
Angry 2.3%
Disgusted 1%

AWS Rekognition

Age 20-38
Gender Female, 50.1%
Confused 45.8%
Disgusted 45.3%
Surprised 45.5%
Sad 46.1%
Angry 45.8%
Happy 45.2%
Calm 51.3%
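
The two face records above (age range, gender, and per-emotion scores) match the shape of Amazon Rekognition's face detection output. A minimal sketch with boto3, reusing the hypothetical file name from the label example:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("starr_centre_photo.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, and other facial attributes.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")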

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
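
The Google Vision rows report likelihood buckets (from "Very unlikely" to "Very likely") rather than percentages. A minimal sketch with the google-cloud-vision client, assuming a recent client version and configured credentials, with a hypothetical file name:

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
client = vision.ImageAnnotatorClient()

with open("starr_centre_photo.jpg", "rb") as f:  # hypothetical file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each detected face carries likelihood buckets, not confidence percentages.
for face in response.face_annotations:
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{attr}_likelihood")
        print(attr.capitalize(), vision.Likelihood(value).name)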

Feature analysis

Amazon

Person 99%
Painting 83.4%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2019-06-05

a vintage photo of a person 85.6%
a vintage photo of a person 82.7%
an old photo of a person 82.6%
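
The ranked captions above look like the output of the Computer Vision describe call, which returns several caption candidates with confidences. A rough REST sketch with the same placeholder endpoint and key as the tag example; the response shape is an assumption:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_COMPUTER_VISION_KEY"                                # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/starr_centre_photo.jpg"},  # hypothetical URL
)
response.raise_for_status()

# Candidates pair a caption string with a 0-1 confidence, shown above as percentages.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")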

Text analysis

Amazon

she

Google

82
82
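
The strings above ("she" from Amazon, "82" twice from Google, probably a detected line plus its single word) come from OCR-style text detection. A minimal sketch of the Rekognition side with boto3; the Google side would use text_detection on the same vision client as the face example:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("starr_centre_photo.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or one of its WORDs, which is why the same
# string can appear more than once in listings like the repeated "82" above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}%")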