Human Generated Data

Title

Untitled (large group of children with parents? in front of doorway on street)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15748

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.6
Person 99.6
Person 99.4
Person 98.4
Person 98.3
Person 95.9
Person 89.7
People 88.5
Person 83
Person 82.2
Person 79.9
Clothing 79.4
Apparel 79.4
Door 70.4
Floor 68.7
Shorts 64.7
Kid 63.8
Child 63.8
Baby 63.4
Road 63.3
Face 62.6
Advertisement 59.5
Building 57.9
Poster 57.2
Housing 55.7
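
The Amazon tag list above is the kind of output returned by the Rekognition label-detection API: each label carries a name and a confidence percentage. A minimal sketch of turning such a response into the "label confidence" lines shown here (the sample response is hypothetical; real data would come from `boto3.client("rekognition").detect_labels(...)`):

```python
# Sketch: format an Amazon Rekognition DetectLabels-style response into
# "Name confidence" lines like the tag list above.

def format_labels(response, min_confidence=55.0):
    """Return 'Name confidence' strings sorted by descending confidence."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response.get("Labels", [])
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: -pair[1])
    return [f"{name} {round(conf, 1)}" for name, conf in labels]

# Hypothetical excerpt mimicking the DetectLabels response shape
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.6},
        {"Name": "Door", "Confidence": 70.4},
        {"Name": "Housing", "Confidence": 55.7},
    ]
}

for line in format_labels(sample):
    print(line)  # e.g. "Person 99.6"
```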

Imagga
created on 2022-02-05

barbershop 100
shop 84.7
mercantile establishment 66.4
place of business 44.3
old 27.8
negative 26.7
window 23.9
film 22.2
establishment 22.1
newspaper 20.7
architecture 20.3
wall 19.7
vintage 19
house 18.4
building 16.8
retro 16.4
grunge 16.2
product 15.9
antique 14.7
city 14.1
home 13.5
windows 13.4
frame 13.3
glass 13.2
texture 13.2
door 13.1
ancient 13
photographic paper 12.7
creation 12.3
street 12
historic 11.9
aged 11.8
dirty 11.7
black 11.4
decoration 11.1
art 11
structure 10.7
light 10.7
travel 10.6
urban 10.5
design 10.1
damaged 9.5
tourism 9.1
interior 8.8
mask 8.6
empty 8.6
grungy 8.5
photographic equipment 8.5
screen 8.3
pattern 8.2
history 8
detail 8
card 7.6
room 7.5
daily 7.5
brown 7.4
water 7.3
digital 7.3
rough 7.3
paint 7.2
material 7.1
sliding door 7.1
textured 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.8
building 99.1
window 92
person 90.2
outdoor 88.7
clothing 85.4
house 82.7
old 55.9

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 97.9%
Calm 76.5%
Happy 14.7%
Sad 3.8%
Angry 1.3%
Disgusted 1.2%
Surprised 1.2%
Fear 0.9%
Confused 0.4%

AWS Rekognition

Age 51-59
Gender Male, 89.4%
Calm 69.6%
Surprised 11.7%
Confused 5.6%
Happy 4.7%
Sad 2.7%
Fear 2.6%
Angry 1.8%
Disgusted 1.2%

AWS Rekognition

Age 42-50
Gender Female, 66.9%
Sad 27.8%
Happy 23.5%
Surprised 21.3%
Calm 19.7%
Angry 2.4%
Fear 1.9%
Confused 1.7%
Disgusted 1.7%

AWS Rekognition

Age 22-30
Gender Male, 98.8%
Calm 82.7%
Sad 7%
Confused 2.7%
Surprised 2.7%
Angry 2%
Happy 1.3%
Disgusted 1%
Fear 0.7%

AWS Rekognition

Age 20-28
Gender Female, 90.9%
Calm 72.5%
Sad 20.5%
Happy 2.2%
Confused 1.5%
Angry 0.9%
Surprised 0.9%
Disgusted 0.8%
Fear 0.6%

AWS Rekognition

Age 14-22
Gender Male, 51.7%
Calm 88.2%
Sad 6.7%
Surprised 1.5%
Angry 1.1%
Confused 0.8%
Happy 0.6%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 13-21
Gender Female, 99.9%
Happy 53.6%
Calm 22.9%
Sad 14.2%
Disgusted 4.4%
Angry 2.4%
Fear 1%
Surprised 0.8%
Confused 0.7%
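
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and a full emotion distribution. A minimal sketch of reducing a face-detection entry to that summary, assuming the `FaceDetails` shape returned by `boto3.client("rekognition").detect_faces(..., Attributes=["ALL"])` (the sample entry below is hypothetical, mirroring the first face block):

```python
# Sketch: summarize a Rekognition DetectFaces-style "FaceDetails" entry
# as age range, gender, and dominant emotion, like the blocks above.

def summarize_face(face):
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions arrive unsorted; the dominant one has the highest confidence
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return (
        f"Age {age['Low']}-{age['High']}, "
        f"{gender['Value']} {gender['Confidence']:.1f}%, "
        f"{top['Type'].capitalize()} {top['Confidence']:.1f}%"
    )

# Hypothetical entry mirroring the first face block above
sample_face = {
    "AgeRange": {"Low": 27, "High": 37},
    "Gender": {"Value": "Female", "Confidence": 97.9},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 14.7},
        {"Type": "CALM", "Confidence": 76.5},
        {"Type": "SAD", "Confidence": 3.8},
    ],
}

print(summarize_face(sample_face))  # "Age 27-37, Female 97.9%, Calm 76.5%"
```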

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a vintage photo of a group of people standing in front of a window 84.1%
a vintage photo of a group of people in front of a window 84%
a vintage photo of a group of people sitting in front of a window 78.2%