Human Generated Data

Title

Untitled (people eating and drinking on a small boat)

Date

1963

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8107

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.6
Person 98.6
Boat 98.4
Transportation 98.4
Vehicle 98.4
Person 98.3
Person 94.4
Person 93.4
Person 92.6
Dining Table 89.1
Table 89.1
Furniture 89.1
Meal 87.5
Food 87.5
Person 82.4
Person 80.4
Chair 79.5
Watercraft 76.8
Vessel 76.8
Outdoors 74.5
Person 72.1
Water 68.5
Restaurant 65.8
Word 65.4
Nature 64.5
People 64.4
Shirt 57.5
Clothing 57.5
Apparel 57.5
Beverage 55.7
Drink 55.7
Text 55.6

Clarifai
created on 2023-10-26

people 99.9
watercraft 99.7
group 99.6
group together 99.4
vehicle 99.1
adult 98.9
many 97.8
man 97.6
transportation system 96.6
rowboat 95.1
monochrome 94.6
woman 94.6
boatman 92.8
several 91.9
child 91.8
recreation 90.1
lifeboat 86.4
wear 85.8
military 83.9
five 80.4

Imagga
created on 2022-01-15

sea 63.1
boat 62.8
ocean 54.2
water 47.4
vessel 45.8
ship 42.5
harbor 28.9
travel 28.9
port 27
sky 26.1
beach 23.6
coast 22.4
tourism 21.4
vacation 20.5
boats 19.4
island 19.2
bay 19
sailing 18.5
dock 18.5
transport 18.3
transportation 17.9
shore 17.7
yacht 17.2
container 17.1
brass 16.4
craft 16
nautical 15.5
fisherman 15.2
summer 14.8
pier 14.6
fishing 14.4
sunset 14.4
wind instrument 14
waves 13.9
ships 13.7
marina 13.7
sail 13.6
fireboat 13.5
horizon 13.5
musical instrument 13
tray 12.9
tourist 12.7
cruise 12.6
coastline 12.2
landscape 11.9
clouds 11.8
city 11.6
man 11.5
holiday 11.5
sand 11.3
lake 11.2
outdoors 11.2
luxury 11.1
ferry 10.8
quay 10.8
shipping 10.8
silhouette 10.8
cargo 10.7
cornet 10.6
receptacle 10.4
day 10.2
people 10
outdoor 9.9
sun 9.7
wave 9.5
bottle 9.4
landmark 9
seaside 9
mast 8.8
urban 8.7
pacific 8.7
seascape 8.6
marine 8.6
skyline 8.5
industry 8.5
bridge 8.5
relax 8.4
famous 8.4
leisure 8.3
deck 8.3
recreation 8.1
water bottle 8
vehicle 8
oceans 7.9
wharf 7.9
twilight 7.8
sunny 7.7
navigation 7.7
sax 7.6
trombone 7.6
evening 7.5
warship 7.3
steel drum 7.2
building 7.1
river 7.1
architecture 7

Microsoft
created on 2022-01-15

water 97.5
text 97.1
outdoor 96.8
ship 92.6
person 90
clothing 79.6
boat 78.5
lake 78.2
watercraft 75.7
people 70.8
man 68.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Male, 94.6%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 55.6%
Calm 86.4%
Happy 9.1%
Surprised 1.4%
Confused 1.1%
Disgusted 0.6%
Angry 0.5%
Sad 0.5%
Fear 0.4%

AWS Rekognition

Age 49-57
Gender Male, 98%
Calm 90.2%
Sad 6.3%
Confused 2.1%
Angry 0.6%
Happy 0.4%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 97.8%
Sad 69%
Calm 15.8%
Happy 8.1%
Disgusted 2.4%
Confused 1.9%
Angry 1.4%
Fear 0.9%
Surprised 0.6%

AWS Rekognition

Age 18-26
Gender Male, 97.9%
Sad 63.3%
Fear 21.2%
Calm 8.1%
Happy 2.8%
Angry 2.1%
Confused 1.4%
Disgusted 0.7%
Surprised 0.4%

AWS Rekognition

Age 21-29
Gender Female, 69.7%
Sad 34.7%
Happy 34.4%
Calm 18.1%
Angry 6.8%
Confused 2.4%
Fear 1.6%
Surprised 1.1%
Disgusted 0.8%

AWS Rekognition

Age 18-26
Gender Female, 65.9%
Calm 86.5%
Sad 6.3%
Happy 3%
Confused 1.3%
Angry 1.2%
Disgusted 0.7%
Surprised 0.6%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.7%
Boat 98.4%

Categories

Text analysis

Amazon

49147.

Google

५१५7.
५१५7.