Human Generated Data

Title

Untitled (standing woman and seated boy on street)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15775

Machine Generated Data

Tags

Each tag is listed with the model's confidence score (0-100).

Amazon
created on 2022-02-05

Person 99.1
Human 99.1
Person 98.7
Clothing 98.6
Apparel 98.6
Person 85.6
Person 80.7
Person 79
Female 78
People 75.8
Car 72.6
Transportation 72.6
Vehicle 72.6
Automobile 72.6
Gown 69.4
Fashion 69.4
Person 67.2
Woman 66.3
Person 66.2
Road 63.4
Robe 62.9
Photography 60.7
Photo 60.7
Suit 60.4
Coat 60.4
Overcoat 60.4
Portrait 59.6
Face 59.6
Wedding 57.2
Tarmac 55.6
Asphalt 55.6
Person 51.2
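
These labels are consistent with the output of Amazon Rekognition's DetectLabels API, which returns label names with 0-100 confidence scores. Below is a minimal sketch of how such tags might be produced with boto3; the image path and MinConfidence threshold are assumptions, not part of this record:

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed cutoff; the tags above bottom out near 51
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```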

Clarifai
created on 2023-10-29

people 99.9
child 99.2
two 98.4
adult 97.6
monochrome 96.1
street 96
man 95.9
family 95
group 94.5
woman 93.7
three 93.5
one 90.5
room 88.4
group together 88.3
boy 87
home 86.8
girl 83.1
furniture 81.5
wear 81.3
recreation 81.2
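
These concepts resemble the output of Clarifai's general image-recognition model. Below is a minimal sketch against Clarifai's v2 REST API; the API key, model ID, and image URL are placeholders, and the general model may be addressed by a different ID in a given workspace:

```python
# Minimal sketch: concept tagging with Clarifai's v2 REST API.
# API key, model ID, and image URL are placeholders/assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed ID for the general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Clarifai reports concept values on a 0-1 scale; scale to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```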

Imagga
created on 2022-02-05

newspaper 100
product 100
creation 95.2
old 18.8
man 16.1
paper 14.6
wall 14.5
people 14.5
grunge 14.5
architecture 14.1
antique 13.8
ancient 13.8
vintage 13.2
retro 13.1
business 12.7
aged 12.7
house 12.5
daily 12.4
paint 11.8
decoration 11.6
male 11.3
finance 11
currency 10.8
building 10.6
interior 10.6
person 10.2
art 9.8
home 9.6
design 9.6
money 9.4
portrait 9.1
dirty 9
pattern 8.9
culture 8.5
card 8.5
texture 8.3
frame 8.3
banking 8.3
room 8.2
dress 8.1
businessman 7.9
urban 7.9
adult 7.8
construction 7.7
bill 7.6
cash 7.3
success 7.2
color 7.2
office 7.2
bank 7.2
financial 7.1
family 7.1
day 7.1
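
These tags match the shape of Imagga's tagging endpoint, which scores tags on a 0-100 scale. Below is a minimal sketch against Imagga's v2 REST API; the credentials and image URL are placeholders:

```python
# Minimal sketch: tagging with Imagga's v2 REST API.
# API key/secret and image URL are placeholders/assumptions.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
response.raise_for_status()

# Imagga nests the English tag name under tag.en; confidence is already 0-100.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```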

Google
created on 2022-02-05

(no tags recorded)

Microsoft
created on 2022-02-05

text 98.9
outdoor 85.9
black and white 81.5
clothing 80.7
drawing 79.4
person 78.5
street 53.4
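
These tags are consistent with the Azure Computer Vision Analyze Image API. Below is a minimal sketch; the endpoint, key, API version, and image URL are placeholders:

```python
# Minimal sketch: tagging with Azure Computer Vision's Analyze Image API.
# Endpoint, key, API version (v3.2), and image URL are placeholders/assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

# Azure reports confidence on a 0-1 scale; scale to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```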

Color Analysis

(no color data recorded)

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 56.1%
Sad 97.3%
Calm 1.8%
Confused 0.4%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 52-60
Gender Female, 86.7%
Calm 94.9%
Sad 1.4%
Surprised 1%
Confused 0.8%
Happy 0.7%
Disgusted 0.5%
Angry 0.4%
Fear 0.4%
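
The two blocks above, each giving an age range, a gender estimate with confidence, and emotions sorted by confidence, match the FaceDetails structure returned by Rekognition's DetectFaces API, one block per detected face. A minimal sketch with boto3 follows; the image path is a placeholder:

```python
# Minimal sketch: face analysis with Amazon Rekognition via boto3.
# "photo.jpg" is a placeholder path; Attributes=["ALL"] requests
# age range, gender, and emotion estimates for each detected face.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```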

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
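
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than numeric scores, which matches the list above. A minimal sketch with the google-cloud-vision client library follows; credentials and the image path are assumptions:

```python
# Minimal sketch: face detection with the Google Cloud Vision client library.
# Assumes google-cloud-vision is installed and credentials are configured;
# "photo.jpg" is a placeholder path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```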

Feature analysis

Amazon

Person

Person 99.1%
Person 98.7%
Person 85.6%
Person 80.7%
Person 79%
Person 67.2%
Person 66.2%
Person 51.2%

Car

Car 72.6%

Categories

(none recorded)

Text analysis

Amazon

CO
DISION CO
DISION
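
The strings above appear to be partial OCR readings of signage in the photograph, reported as detected. A minimal sketch of text detection with Rekognition's DetectText API via boto3 follows; the image path is a placeholder:

```python
# Minimal sketch: text detection with Amazon Rekognition via boto3.
# "photo.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is a LINE or WORD with its own confidence score.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```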