Human Generated Data

Title

Untitled (couple talking to man in car towing a boat)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8048

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Car 99.6
Transportation 99.6
Automobile 99.6
Vehicle 99.6
Person 94.9
Human 94.9
Road 87.3
Sedan 80.4
Car 79.3
Bumper 77.4
Person 73.3
Wheel 64.4
Machine 64.4
Outdoors 61.1
Tarmac 58.9
Asphalt 58.9
People 58.5
Fashion 56.2
Gown 56.2
Evening Dress 56.2
Clothing 56.2
Robe 56.2
Apparel 56.2
Nature 55.6
Street 55.3
Urban 55.3
Town 55.3
Building 55.3
City 55.3

Imagga
created on 2022-01-15

motor vehicle 100
car 92.2
golf equipment 51.8
beach wagon 49
wheeled vehicle 49
sports equipment 38.8
vehicle 33.3
road 27.1
transportation 26.9
equipment 26.2
travel 26.1
automobile 21.1
transport 20.1
drive 18.9
sea 18.8
sky 18.5
auto 18.2
water 16.7
street 16.6
speed 15.6
traffic 15.2
boat 14.9
landscape 14.9
highway 14.5
limousine 12.6
tourism 12.4
old 11.8
truck 11.8
city 11.6
ocean 11.6
driving 11.6
trees 11.6
shore 11.2
fast 11.2
building 11.1
tourist 10
summer 9.6
urban 9.6
beach 9.3
outdoor 9.2
racer 9.1
park 9.1
weather 9
coast 9
rural 8.8
engine 8.7
scene 8.7
wheel 8.6
ship 8.5
snow 8.5
winter 8.5
wood 8.3
vintage 8.3
vacation 8.2
sun 8.1
shopping cart 8
river 8
holiday 7.9
cloud 7.7
motor 7.7
motion 7.7
harbor 7.7
clouds 7.6
industrial 7.3
day 7.1
scenic 7
season 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.7
vehicle 92.3
car 88.7
land vehicle 81.3
black and white 73.9

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 85.6%
Happy 89.8%
Angry 3.3%
Calm 2.8%
Surprised 2%
Disgusted 0.8%
Fear 0.7%
Confused 0.3%
Sad 0.3%

AWS Rekognition

Age 20-28
Gender Female, 97.8%
Happy 39.2%
Calm 36.3%
Sad 21.9%
Fear 1%
Confused 0.6%
Angry 0.5%
Disgusted 0.4%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Car 99.6%
Person 94.9%
Wheel 64.4%

Captions

Microsoft

an old photo of a person 55.3%
an old photo of a boat 31.6%
old photo of a person 31.5%

Text analysis

Amazon

43783

Google

43783
43783