Human Generated Data

Title

Untitled (children in a goat-drawn cart)

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2907

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-01-08

Wheel 99.6
Machine 99.6
Wheel 99.5
Person 99.2
Human 99.2
Person 99.1
Cow 97
Animal 97
Cattle 97
Mammal 97
Transportation 81.8
Vehicle 81.4
Wagon 55.3
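
A label/confidence list like the one above is the shape of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch with boto3, assuming configured AWS credentials and a local copy of the image (the file name "photo.jpg" is illustrative, not part of this record):

# Sketch: Rekognition-style label tags via boto3 DetectLabels.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # lowest score in the list above is Wagon at 55.3
    )

# Each label carries a name and a 0-100 confidence, e.g. "Wheel 99.6".
# Per-instance confidences (one plausible source of the two Wheel and two
# Person entries above) live under label["Instances"].
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")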

Clarifai
created on 2023-10-25

people 99.5
sepia 99.4
cavalry 98.8
vintage 98.7
retro 97.7
sepia pigment 97.2
child 97.2
nostalgia 96
mammal 95.3
monochrome 95
two 94.5
group 94.4
nostalgic 93.5
wear 93.5
seated 93.2
old 92.2
son 92
family 91
art 90.8
man 90.8

Imagga
created on 2022-01-08

memorial 26.9
old 25.8
structure 24.5
sculpture 22
ancient 21.6
blackboard 20.8
statue 19.6
art 18.7
antique 18.3
vintage 18.2
history 17
stone 16.6
cemetery 16.5
historic 16.5
brass 16.3
grunge 16.2
aged 15.4
architecture 14.8
decoration 14
building 13.2
texture 13.2
retro 13.1
religion 12.5
head 11.8
design 11.3
monument 11.2
city 10.8
people 10.6
mask 10.5
paper 10.5
old fashioned 10.5
culture 10.3
marble 10.1
dirty 9.9
travel 9.9
damaged 9.5
weathered 9.5
man 9.5
historical 9.4
frame 9.2
traditional 9.1
tourism 9.1
detail 8.8
empty 8.6
person 8.6
famous 8.4
element 8.3
dress 8.1
figure 7.8
face 7.8
wall 7.8
carving 7.8
gravestone 7.7
outdoor 7.6
negative 7.6
groom 7.6
pattern 7.5
symbol 7.4
paint 7.2
landmark 7.2
sky 7
textured 7

Google
created on 2022-01-08

Photograph 94.2
Vertebrate 91.9
Wheel 90.5
Mammal 85.6
Tire 83.8
Working animal 83.5
Adaptation 79.2
People 78.4
Vintage clothing 76.6
Toddler 76.1
Cart 75
Pack animal 74.7
Snapshot 74.3
Art 71.6
Motor vehicle 71.2
Baby 67.8
Stock photography 64.8
Carriage 64.1
Sitting 63.9
Livestock 63.3
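
The Google list above matches the output of Cloud Vision label detection. A minimal sketch with the google-cloud-vision client (v2+ interface assumed), again with an illustrative file name and configured credentials:

# Sketch: Google Cloud Vision label detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores come back as 0-1 floats; the listing above shows them scaled to
# a 0-100 range.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")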

Microsoft
created on 2022-01-08

outdoor 99.5
text 98.5
horse 95
cart 94.9
old 83.9
person 77.2
child 76.8
wheel 58.6
land vehicle 56.9
vehicle 53.5

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 99.8%
Sad 79.8%
Calm 16.2%
Fear 2.6%
Disgusted 0.3%
Angry 0.3%
Surprised 0.3%
Happy 0.3%
Confused 0.2%

AWS Rekognition

Age 2-8
Gender Female, 100%
Calm 99.2%
Angry 0.5%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Happy 0%
Surprised 0%
Fear 0%

Microsoft Cognitive Services

Age 4
Gender Female

Microsoft Cognitive Services

Age 2
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
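
The age/gender/emotion blocks above follow the shape of AWS Rekognition's DetectFaces output; Google Vision instead reports coarse likelihood buckets (Very unlikely through Very likely) per attribute. A minimal boto3 sketch, under the same illustrative assumptions as before:

# Sketch: per-face age range, gender, and emotions via boto3 DetectFaces.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 1, "High": 7}
    gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 99.8}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to mirror the listings above.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")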

Feature analysis

Amazon

Wheel 99.6%
Person 99.2%
Cow 97%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

1936
SPORT

Google

1936
1936
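
OCR hits like "1936" and "SPORT" above correspond to Rekognition's DetectText output. A minimal sketch, same assumptions; note the API returns both LINE and WORD detections, which is one way the same string can be listed twice, as "1936" is in the Google results:

# Sketch: text-in-image detection via boto3 DetectText.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is typed LINE or WORD; a word repeated at both levels
# shows up twice in a flat listing.
for det in response["TextDetections"]:
    print(f"{det['DetectedText']} ({det['Type']}, {det['Confidence']:.1f}%)")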