Human Generated Data

Title

Untitled (photograph of three young children standing around baby in carriage)

Date

c. 1930, printed later

People

Artist: Curtis Studio, American, active 1891 - 1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13162

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.6
Human 99.6
Person 99.2
Person 95.6
Vehicle 91.3
Transportation 91.3
Wheel 90.6
Machine 90.6
Bicycle 81.1
Bike 81.1
Spoke 80.4
Art 69.9
Wheel 67.6
Clothing 65.6
Apparel 65.6
Text 59.9
Poster 56.9
Advertisement 56.9
Carriage 56.8
People 55.3

Clarifai
created on 2023-10-26

people 99.9
nostalgia 99.4
sepia 99.4
vintage 99.2
portrait 98.9
retro 98.5
antique 97.7
nostalgic 97.3
child 96.5
sepia pigment 96.5
centennial 96.2
two 95.7
art 95.5
old 95.4
lid 95.2
man 95.1
adult 94.6
group 93.6
print 92.8
vehicle 92.5

Imagga
created on 2022-01-22

sculpture 31.1
art 22
statue 22
architecture 21.1
old 20.2
ancient 19.9
tourism 17.3
antique 17.1
history 17
building 16.8
historic 16.5
window 16.2
stone 15.3
travel 14.8
monument 14
marble 13.7
religion 13.4
city 13.3
historical 13.2
wall 12.8
vintage 12.4
decoration 11.9
carving 11.7
famous 11.2
grunge 11.1
tourist 10.9
religious 10.3
drawing 10.3
culture 10.2
black 10.2
structure 9.9
detail 9.6
door 9.1
design 9.1
dirty 9
landmark 9
retro 9
man 8.7
temple 8.5
house 8.4
sconce 8.3
sketch 8.1
device 7.9
figure 7.8
frame 7.7
pattern 7.5
support 7.5
church 7.4
holiday 7.2
interior 7.1

Google
created on 2022-01-22

Wheel 89.5
Picture frame 88.5
Tire 85.4
Art 80.6
Vehicle 78.2
Vintage clothing 77.2
Chair 71.2
Stock photography 66.3
Poster 66.3
Toddler 65.1
Baby 61.6
Sitting 61.4
Oval 61.2
Visual arts 59.7
Illustration 59.4
History 57.7
Rectangle 57.7
Room 57
Baby Products 56.3
Painting 54.5

Microsoft
created on 2022-01-22

old 93.4
person 93.3
clothing 91.4
text 87
gallery 85.2
room 76.4
man 60.9
posing 54.9
wheel 50.5
vintage 47.7
picture frame 7.9
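The provider tag lists above all share one shape: a label paired with a confidence score on a 0-100 scale. A minimal sketch of filtering such lists by a confidence threshold, using a few (label, score) pairs transcribed directly from the Amazon and Microsoft lists above:

```python
# Each tagging service returns (label, confidence) pairs; confidences
# are percentages. Sample values transcribed from the lists above.
amazon_tags = [
    ("Person", 99.6), ("Wheel", 90.6), ("Carriage", 56.8), ("Poster", 56.9),
]
microsoft_tags = [
    ("old", 93.4), ("person", 93.3), ("wheel", 50.5), ("picture frame", 7.9),
]

def confident_tags(tags, threshold=90.0):
    """Keep only labels at or above the given confidence threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(amazon_tags))     # ['Person', 'Wheel']
print(confident_tags(microsoft_tags))  # ['old', 'person']
```

Low-confidence labels (here, "Carriage" at 56.8 or "picture frame" at 7.9) are where the machine tags most often diverge from the human cataloging, so a threshold like this is a common first-pass filter.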

Face analysis

AWS Rekognition

Age 2-8
Gender Male, 99.8%
Happy 97.2%
Calm 1%
Confused 0.4%
Angry 0.4%
Surprised 0.4%
Sad 0.4%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 6-14
Gender Male, 96.9%
Happy 93%
Fear 2.5%
Confused 1.3%
Disgusted 0.8%
Sad 0.8%
Surprised 0.7%
Angry 0.5%
Calm 0.4%

AWS Rekognition

Age 2-10
Gender Female, 100%
Confused 36.1%
Calm 16.9%
Happy 15.5%
Surprised 10.9%
Fear 7.5%
Angry 6.7%
Sad 5.7%
Disgusted 0.6%

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
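Each AWS Rekognition face record above is a distribution over eight emotions. A small sketch of reducing such a record to its dominant emotion, using the scores transcribed from the third face record (the one read as Female, 100%):

```python
# Emotion scores (percent) transcribed from the third AWS Rekognition
# face record above; the eight keys are Rekognition's emotion labels.
face3 = {
    "Confused": 36.1, "Calm": 16.9, "Happy": 15.5, "Surprised": 10.9,
    "Fear": 7.5, "Angry": 6.7, "Sad": 5.7, "Disgusted": 0.6,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face3))  # ('Confused', 36.1)
```

Note how flat this distribution is compared to the first two faces (Happy at 97.2% and 93%): a top score of 36.1% means the single-label readout is much less reliable for this face.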

Feature analysis

Amazon

Person 99.6%
Wheel 90.6%
Poster 56.9%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

KODYA