Human Generated Data

Title

Untitled (family outside next to car)

Date

1970s copy negative from a c. 1935 negative

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21770

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.7
Human 99.7
Dress 99.6
Clothing 99.6
Apparel 99.6
Car 99.5
Automobile 99.5
Vehicle 99.5
Transportation 99.5
Person 99.4
Person 99.2
Wheel 95.8
Machine 95.8
Female 92.1
Face 83.8
Bumper 83.4
Bridegroom 80.9
Wedding 80.9
Gown 77.3
Fashion 77.3
Woman 76
Robe 75.2
Outdoors 75.1
Hot Rod 72.6
Grass 70.5
Plant 70.5
Person 69.1
Portrait 68.5
Photography 68.5
Photo 68.5
Evening Dress 63
Nature 62.1
Wedding Gown 60.7
Antique Car 58.5
Tire 55.2
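
The scores above are Rekognition's confidence values (0-100) for each detected label. A minimal sketch of how such a tag list is produced with boto3's detect_labels call; the image file name and the confidence floor are assumptions for illustration:

    import boto3

    # Rekognition client; region and credentials come from the AWS environment.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("untitled_family_next_to_car.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55.0,  # assumed floor; the lowest score above is Tire 55.2
    )

    # One "label confidence" pair per line, mirroring the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")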

Clarifai
created on 2023-10-22

people 99.9
adult 97.9
vehicle 97.4
group 97
woman 93.2
group together 91.6
man 90.5
transportation system 89.8
car 84.8
wedding 84.6
child 84
leader 82.7
two 81.1
canine 80.7
several 80.6
retro 80.3
three 79.4
portrait 78.1
nostalgia 77.9
bride 77.1
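
Clarifai scores concepts on a 0-1 scale that the list above shows as percentages. A hedged sketch against Clarifai's v2 predict endpoint; the API key, model alias, and image URL are all placeholders:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
    MODEL_ID = "general-image-recognition"   # Clarifai's general model alias

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
        timeout=30,
    )

    # Each concept has a name and a 0-1 value; scale to match the list above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")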

Imagga
created on 2022-03-11

car 56
vehicle 40.3
motor vehicle 30.6
groom 24.5
wheeled vehicle 21.7
auto 20.1
truck 19.5
sky 19.2
road 19
transportation 18.8
landscape 17.1
grass 16.6
outdoor 16.1
military vehicle 15.5
automobile 15.3
transport 14.6
sunset 13.5
field 13.4
travel 13.4
wheel 13.3
summer 12.9
old 12.5
rural 12.3
adult 12.3
people 12.3
sun 12.1
jeep 11.5
man 11.4
country 11.4
drive 11.4
person 11.3
happiness 11
limousine 10.9
farm 10.7
love 10.3
mobile home 10.2
two 10.2
beach 10.1
danger 10
outdoors 9.8
male 9.2
silhouette 9.1
industrial 9.1
bride 8.8
happy 8.8
couple 8.7
extreme 8.6
outside 8.6
trailer 8.5
pretty 8.4
horizontal 8.4
sand 8.3
park 8.2
countryside 8.2
vacation 8.2
coast 8.1
housing 8
structure 8
water 8
4x4 7.9
sea 7.8
black 7.8
accident 7.8
cloud 7.7
attractive 7.7
dirt 7.6
rusty 7.6
adventure 7.6
sunrise 7.5
desert 7.5
world 7.4
tourist 7.4
wedding 7.4
bus 7.4
yellow 7.3
romantic 7.1
sunlight 7.1
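
Imagga's /v2/tags endpoint returns the same kind of tag/confidence pairs, scored 0-100 and sorted highest first. A sketch assuming HTTP Basic auth with a hypothetical key/secret pair and a placeholder image URL:

    import requests

    API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholders

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )

    # Tags arrive sorted by confidence, highest first.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")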

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 99.1
land vehicle 93.2
vehicle 93.1
car 86.6
clothing 81.5
person 80
dress 77.2
wheel 67.8
white 62.1
old 52.5
picture frame 16.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 50-58
Gender Male, 80.8%
Calm 37.4%
Happy 33.3%
Sad 26%
Confused 1.2%
Surprised 0.9%
Angry 0.6%
Disgusted 0.5%
Fear 0.2%

AWS Rekognition

Age 31-41
Gender Male, 99%
Calm 97.4%
Happy 1.9%
Sad 0.2%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 72.9%
Happy 80.2%
Calm 14.6%
Sad 2%
Confused 1.1%
Surprised 0.7%
Disgusted 0.6%
Angry 0.5%
Fear 0.4%
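
Each block above (an age range, a gender call with confidence, and emotion scores) matches the FaceDetails structure that Rekognition's detect_faces returns when all attributes are requested. A sketch with boto3; the file name is again a placeholder:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_family_next_to_car.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotions to each face.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes}, Attributes=["ALL"]
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are scored per face; print them strongest first.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")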

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
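
Unlike Rekognition's percentages, the Vision API reports each face attribute as a likelihood bucket from VERY_UNLIKELY to VERY_LIKELY. A sketch with the google-cloud-vision client; the image URI is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Placeholder URI; raw bytes also work via vision.Image(content=...).
    image = vision.Image(
        source=vision.ImageSource(image_uri="gs://your-bucket/photo.jpg")
    )

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)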

Feature analysis

Amazon

Person 99.7%
Person 99.4%
Person 99.2%
Person 69.1%
Car 99.5%
Wheel 95.8%
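
The repeated Person lines come from per-instance bounding boxes: for countable labels such as Person, Car, and Wheel, detect_labels returns one Instances entry per object found. A sketch extracting them, with the same hypothetical file name as before:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_family_next_to_car.jpg", "rb") as f:  # hypothetical file
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()}, MinConfidence=55.0
        )

    # Countable labels carry bounding boxes; one line per detected instance,
    # which is why Person appears four times above.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # relative Left/Top/Width/Height
            print(f"{label['Name']} {instance['Confidence']:.1f}% box={box}")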

Categories

Imagga

paintings art 98.1%

Captions

Microsoft
created on 2022-03-11

a vintage photo of a person 78.7%
a vintage photo of a girl 65.6%
a vintage photo of a person 65.5%
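
The three captions, each with a confidence score, are the candidate descriptions produced by Azure Computer Vision's Describe operation. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
    )

    # Ask for several candidate captions, each scored 0-1.
    analysis = client.describe_image(
        "https://example.org/photo.jpg", max_candidates=3
    )

    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")

The tag list in the Microsoft section above would come from the companion tag_image operation on the same client.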