Human Generated Data

Title

Untitled (group of people watching street portrait artist)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15710

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.9
Human 98.9
Person 98.8
Person 96.1
Wheel 95.5
Machine 95.5
Person 94.8
Person 94.3
Wheel 93.2
Person 92.4
Person 88.2
Motorcycle 87.9
Transportation 87.9
Vehicle 87.9
Clothing 85
Apparel 85
Person 82.6
Advertisement 81.2
Poster 78.1
People 77.9
Wheel 74.2
Person 72.7
Furniture 69.3
Face 66.8
Person 65.2
Person 63.2
Female 63.1
Table 61.8
Person 61.6
Drawing 59.4
Art 59.4
Collage 58.5
Shoe 57.9
Footwear 57.9
Girl 56.6
Text 56
Paper 55.4

Clarifai
created on 2023-10-29

people 99.9
group 98.8
adult 97.1
many 96.8
group together 96.8
wear 96
child 95.1
administration 93.9
woman 93
man 92.7
nostalgia 92
vehicle 91.7
war 90.4
nostalgic 90.2
military 89.9
outfit 88.7
uniform 87
monochrome 87
boy 86.1
music 85.4

Imagga
created on 2022-02-05

newspaper 58.6
product 45.3
creation 35.4
barbershop 18.9
shop 17.6
modern 14.7
equipment 14.6
art 14.5
design 13.5
technology 13.3
mercantile establishment 13.3
glass 12.2
digital 12.1
monitor 11.8
city 11.6
interior 11.5
stall 10.9
building 10.7
retro 10.6
business 10.3
house 10
texture 9.7
home 9.6
computer 8.9
window 8.9
science 8.9
room 8.8
place of business 8.8
light 8.7
contemporary 8.5
communication 8.4
metal 8
daily 8
black 7.8
architecture 7.8
industry 7.7
pattern 7.5
decoration 7.3
data 7.3
new 7.3
colorful 7.2
restaurant 7.2
information 7.1
working 7.1
work 7.1
display 7
person 7

Google
created on 2022-02-05

Tire 87.5
Style 83.8
Black-and-white 83.8
Dress 83.5
Table 80.1
Wheel 79.8
Adaptation 79.2
Chair 75.8
Motor vehicle 74.7
Monochrome 74.7
Snapshot 74.3
Window 73.2
Monochrome photography 72
Event 71.9
Art 71.5
Suit 71.4
Room 70.2
Vintage clothing 68
Illustration 67
Font 66.7

Microsoft
created on 2022-02-05

text 98.9
person 97.8
outdoor 92.6
clothing 84.1
people 82.3
group 64.4
crowd 47.9

Face analysis

AWS Rekognition

Age 53-61
Gender Female, 61.6%
Calm 59%
Sad 34.6%
Confused 2%
Happy 1.5%
Disgusted 0.9%
Fear 0.9%
Surprised 0.6%
Angry 0.3%

AWS Rekognition

Age 25-35
Gender Male, 93.1%
Sad 64.8%
Calm 33%
Confused 1.1%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 52-60
Gender Female, 87.8%
Calm 77.1%
Confused 12.5%
Sad 4.6%
Angry 1.8%
Surprised 1.3%
Happy 1.3%
Disgusted 0.9%
Fear 0.5%

AWS Rekognition

Age 25-35
Gender Female, 72.4%
Sad 68.3%
Calm 28.9%
Disgusted 0.7%
Confused 0.5%
Happy 0.5%
Angry 0.5%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 98.6%
Calm 94.2%
Surprised 1.8%
Confused 1.1%
Sad 1%
Happy 0.9%
Disgusted 0.5%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 23-33
Gender Male, 85.9%
Calm 93.5%
Surprised 3.3%
Happy 1.1%
Confused 0.8%
Disgusted 0.5%
Sad 0.5%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 20-28
Gender Female, 65.6%
Calm 97.4%
Sad 2%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Female, 97.2%
Calm 60.3%
Happy 33%
Sad 2.8%
Surprised 2.5%
Fear 0.5%
Disgusted 0.3%
Confused 0.3%
Angry 0.3%

AWS Rekognition

Age 51-59
Gender Male, 88.3%
Calm 95.1%
Sad 2.9%
Confused 0.9%
Happy 0.5%
Disgusted 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 14-22
Gender Male, 86.6%
Calm 98.8%
Surprised 0.5%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 42-50
Gender Female, 71.5%
Calm 65.5%
Surprised 12.8%
Happy 10.6%
Disgusted 3.5%
Fear 3.2%
Sad 2.7%
Confused 1%
Angry 0.6%

AWS Rekognition

Age 22-30
Gender Male, 83.6%
Calm 54.7%
Surprised 31.4%
Fear 7.8%
Confused 2.6%
Disgusted 1.3%
Angry 1.1%
Happy 0.6%
Sad 0.5%

AWS Rekognition

Age 41-49
Gender Female, 56.8%
Calm 97.4%
Sad 1.3%
Surprised 0.4%
Confused 0.2%
Angry 0.2%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-30
Gender Female, 51.3%
Sad 43.6%
Surprised 32.2%
Calm 7.9%
Happy 6%
Confused 3.8%
Angry 2.4%
Disgusted 2.3%
Fear 1.8%

AWS Rekognition

Age 39-47
Gender Male, 98.9%
Calm 94.9%
Surprised 4%
Happy 0.4%
Angry 0.2%
Confused 0.1%
Fear 0.1%
Sad 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Person 98.8%
Person 96.1%
Person 94.8%
Person 94.3%
Person 92.4%
Person 88.2%
Person 82.6%
Person 72.7%
Person 65.2%
Person 63.2%
Person 61.6%
Wheel 95.5%
Wheel 93.2%
Wheel 74.2%
Motorcycle 87.9%
Shoe 57.9%