Human Generated Data

Title

Untitled (man escorting woman out of large car with two other women beside him at entrance to estate)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10029

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.7
Human 99.7
Person 98.5
Person 98.3
Person 97.5
Vehicle 95.9
Transportation 95.9
Clothing 92.4
Apparel 92.4
Automobile 89.3
Wheel 87.4
Machine 87.4
Wheel 85.4
Car 84.8
Spoke 81.8
Person 77.5
Outdoors 73.2
Tree 72
Plant 72
Female 68
Shorts 65.6
Sedan 58.4
Bike 56.6
Bicycle 56.6
Tire 56.4
Overcoat 55.1
Coat 55.1
Dress 55.1
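
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's label-detection endpoint, with confidences on a 0-100 scale. A minimal sketch of how such output could be reproduced with boto3 follows; the bucket and object names are placeholders, not part of this record.

# Sketch: label/confidence pairs like the Amazon tags above, via
# Amazon Rekognition label detection (boto3). Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=50,
    MinConfidence=55,  # the lowest score in this record is about 55.1
)

for label in response["Labels"]:
    # prints lines such as "Person 99.7", "Wheel 87.4", "Car 84.8"
    print(f"{label['Name']} {label['Confidence']:.1f}")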

Imagga
created on 2022-01-28

tricycle 73.5
wheeled vehicle 69.2
vehicle 47.7
bench 37.5
park bench 36.3
conveyance 32.2
seat 26.2
architecture 18.1
building 17.5
furniture 17.3
man 16.1
street 15.6
old 15.3
city 15
people 14.5
outdoors 13.9
outdoor 13.8
travel 13.4
male 12.8
wheelchair 12
handcart 11.7
park 11.5
transportation 10.8
color 10.6
ancient 10.4
house 10
chair 9.5
water 9.3
black 9
history 8.9
interior 8.8
scene 8.7
day 8.6
tree 8.5
winter 8.5
tourism 8.2
light 8
night 8
furnishing 8
barrow 8
disabled 7.9
couple 7.8
person 7.8
culture 7.7
sky 7.6
shopping cart 7.6
lamp 7.6
statue 7.6
senior 7.5
monument 7.5
tourist 7.2
road 7.2
home 7.2
art 7.2
portrait 7.1
adult 7.1
summer 7.1
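
The Imagga tags follow the same tag-plus-confidence pattern. A hedged sketch using Imagga's public /v2/tags REST endpoint is shown below; the API key, secret, and image URL are placeholders.

# Sketch: tag/confidence pairs like the Imagga tags above, via the
# Imagga /v2/tags endpoint. Credentials and image URL are placeholders.
import requests

API_KEY = "your_api_key"                      # placeholder
API_SECRET = "your_api_secret"                # placeholder
IMAGE_URL = "https://example.org/photo.jpg"   # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # prints lines such as "tricycle 73.5", "wheeled vehicle 69.2"
    print(f"{item['tag']['en']} {item['confidence']:.1f}")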

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

tree 99.9
text 98.1
outdoor 97.7
snow 88.5
black and white 88.4
white 72.3
street 68.5
monochrome 56.8

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
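
Each Google Vision block above reports per-face attribute likelihoods (surprise, anger, sorrow, joy, headwear, blur) on a scale from "Very unlikely" to "Very likely". A minimal sketch using the google-cloud-vision client is shown below; the image path is a placeholder.

# Sketch: per-face likelihood ratings like the Google Vision blocks above.
# The image path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

likelihood = {
    vision.Likelihood.UNKNOWN: "Unknown",
    vision.Likelihood.VERY_UNLIKELY: "Very unlikely",
    vision.Likelihood.UNLIKELY: "Unlikely",
    vision.Likelihood.POSSIBLE: "Possible",
    vision.Likelihood.LIKELY: "Likely",
    vision.Likelihood.VERY_LIKELY: "Very likely",
}

for face in client.face_detection(image=image).face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])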

Feature analysis

Amazon

Person 99.7%
Wheel 87.4%
Car 84.8%

Captions

Microsoft

an old photo of a street 81.2%
old photo of a street 76.4%
a person riding on the back of a truck 27%
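
The Microsoft captions are ranked caption candidates with confidence scores from the Azure Computer Vision describe operation. A hedged sketch follows; the endpoint, key, and image path are placeholders.

# Sketch: ranked caption candidates with confidences, like the Microsoft
# captions above. Endpoint, key, and image path are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("<your-key>"),              # placeholder
)

with open("photo.jpg", "rb") as f:  # placeholder path
    result = client.describe_image_in_stream(f, max_candidates=3)

for caption in result.captions:
    # prints lines such as "an old photo of a street 81.2%"
    print(f"{caption.text} {caption.confidence * 100:.1f}%")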

Text analysis

Amazon

RODVK--E.VEEIA--EITW
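
The text analysis line is raw OCR output from Amazon Rekognition's text-detection endpoint; on this image it returned only the garbled string above. A minimal sketch of the call is shown below, with placeholder bucket and key names.

# Sketch: OCR output like the Amazon text analysis above, via Amazon
# Rekognition text detection (boto3). Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

lines = [
    det["DetectedText"]
    for det in response["TextDetections"]
    if det["Type"] == "LINE"
]
print(" ".join(lines))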