Human Generated Data

Title

Untitled (photograph of three young children standing around baby in carriage)

Date

c. 1930, printed later

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13163

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Wheel 99.8
Machine 99.8
Person 99.7
Human 99.7
Person 99.4
Musician 98.5
Musical Instrument 98.5
Person 97.3
Wheel 95.3
Wheel 94.1
Clothing 89.1
Apparel 89.1
Drummer 80.4
Percussion 80.4
Drum 77.5
Bicycle 68.5
Transportation 68.5
Vehicle 68.5
Bike 68.5
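
These labels match the shape of Amazon Rekognition's DetectLabels response. Below is a minimal sketch of generating such tags with boto3; the bucket and object key are placeholder assumptions, not taken from this record.

import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

# Placeholder S3 location (assumption); raw bytes also work via Image={"Bytes": ...}.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Rekognition reports confidence on a 0-100 scale, as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')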

Clarifai
created on 2023-10-26

people 100
vintage 99.1
sepia 99.1
two 98.9
retro 98.7
sepia pigment 98.6
nostalgia 98.3
portrait 98.1
transportation system 97.6
carriage 97.3
adult 97.3
nostalgic 96.5
vehicle 96.2
old 95.1
group 95
street 94.9
memory 94.2
centennial 93.9
three 93.8
man 93.7
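
The concepts above resemble output from Clarifai's general image recognition model. A hedged sketch against the Clarifai v2 REST API follows; the personal access token, model ID, and image URL are placeholder assumptions.

import requests

# Placeholder credentials and model ID (assumptions, not from this record).
PAT = "YOUR_CLARIFAI_PAT"
MODEL_ID = "general-image-recognition"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Clarifai scores concepts 0-1; scale to match the 0-100 values above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')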

Imagga
created on 2022-01-22

architecture 37.2
wall 34.7
old 32.7
ancient 28.5
building 27.4
window 23.5
antique 22.2
stone 21.2
door 21.1
sculpture 20.9
art 19.5
texture 19.4
house 19.2
brick 19.1
vintage 19
structure 18.8
detail 18.5
decoration 18
historic 15.6
town 14.8
travel 14.8
entrance 14.5
aged 14.5
grunge 14.5
religion 14.3
history 14.3
carving 14.1
tourism 14
city 13.3
historical 13.2
exterior 12.9
tile 12.7
landmark 12.6
architectural 12.5
pattern 12.3
church 12
construction 12
gate 11.7
wood 11.7
frame 11.6
arch 11.6
design 11.3
monument 11.2
glass 10.9
wooden 10.5
statue 10.5
facade 10.4
home 10.4
religious 10.3
culture 10.3
famous 10.2
street 10.1
memorial 9.9
retro 9.8
device 9.7
metal 9.7
drawing 9.6
medieval 9.6
weathered 9.5
ornament 9.5
graffito 9.5
close 9.1
rough 9.1
dirty 9
brass 9
style 8.9
temple 8.8
textured 8.8
marble 8.7
windows 8.6
middle 8.6
yellow 8.6
damaged 8.6
balcony 8.6
destination 8.4
sketch 8.3
paint 8.1
material 8
paper 7.9
urban 7.9
worn 7.6
iron 7.6
details 7.5
brown 7.4
decor 7.1
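
Tags of this shape can be requested from Imagga's v2 tagging endpoint. A minimal sketch follows; the API key, secret, and image URL are placeholder assumptions.

import requests

# Placeholder API credentials (assumptions); Imagga uses HTTP basic auth.
auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)

# Imagga already reports confidence on a 0-100 scale, as listed above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')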

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.2
old 90.7
clothing 86.2
person 86
bicycle 84.8
bicycle wheel 50.4
vintage 37.4
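
These tags match the shape of Azure Computer Vision's Analyze Image output. A hedged sketch against the v3.2 REST endpoint follows; the resource endpoint, key, and image URL are placeholder assumptions.

import requests

# Placeholder Azure resource endpoint and key (assumptions).
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)

# Azure scores tags 0-1; scale to match the 0-100 values above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')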

Face analysis

AWS Rekognition

Age 2-8
Gender Male, 75%
Happy 88.7%
Sad 3.6%
Calm 3%
Fear 1.7%
Angry 1%
Confused 0.9%
Surprised 0.8%
Disgusted 0.4%

AWS Rekognition

Age 6-14
Gender Female, 62.3%
Happy 99.2%
Confused 0.2%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%
Calm 0.1%
Sad 0.1%
Angry 0.1%

AWS Rekognition

Age 2-10
Gender Female, 99.9%
Confused 55.5%
Surprised 19.9%
Calm 9.1%
Fear 5.7%
Happy 4.1%
Angry 2.3%
Sad 2.2%
Disgusted 1.3%

AWS Rekognition

Age 0-4
Gender Female, 88%
Angry 96.8%
Confused 2.1%
Surprised 0.5%
Calm 0.2%
Sad 0.2%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%
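
Per-face estimates like the four blocks above come from Amazon Rekognition's DetectFaces operation with all attributes enabled. A minimal sketch with boto3 follows; the local file name is a placeholder assumption.

import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name (assumption)
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; list them highest-confidence first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')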

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 41
Gender Female

Microsoft Cognitive Services

Age 2
Gender Female

Microsoft Cognitive Services

Age 21
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely
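
The ratings above follow Google Cloud Vision's face detection output, which reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch with the google-cloud-vision client follows; the credentials setup and file name are placeholder assumptions.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

def pretty(likelihood) -> str:
    # e.g. VERY_UNLIKELY -> "Very unlikely", matching the record above.
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

with open("photo.jpg", "rb") as f:  # placeholder file name (assumption)
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))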

Feature analysis

Amazon

Wheel 99.8%
Person 99.7%
Bicycle 68.5%

Categories

Imagga

paintings art 99.2%

Captions