Human Generated Data

Title

Untitled (family eating under awning in front of trailer home)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8707

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 97.9
Person 91.6
Person 90.9
Furniture 82.3
Clothing 78.9
Apparel 78.9
Table 75.1
Chair 70.6
Aircraft 70.3
Vehicle 70.3
Transportation 70.3
Person 68
Person 65.8
Building 65.6
Biplane 63.2
Airplane 62.8
People 61.3
Face 59.2
Shorts 57
Text 56.4

Clarifai
created on 2023-10-25

people 99.9
group together 99.7
aircraft 99.4
vehicle 99
airplane 98.3
group 97.9
transportation system 97
military 96.7
man 95.6
three 95.6
adult 95.2
chair 94.8
music 94
two 93.2
airport 93
aviate 92.6
several 92.3
war 91.8
woman 91.4
monochrome 91

Imagga
created on 2022-01-09

television 92.4
broadcasting 37.8
telecommunication system 36.8
telecommunication 28.4
people 19
medium 18.3
silhouette 17.4
man 16.8
person 15.9
technology 14.8
crowd 14.4
player 14.1
television camera 14.1
equipment 13.6
stadium 13.6
electronic equipment 13.1
flag 12.8
male 12.8
business 12.7
monitor 12.5
nation 12.3
symbol 12.1
lights 12
black 12
competition 11.9
design 11.8
cheering 11.7
world 11.7
audience 11.7
transportation 11.6
sport 11.6
shoot 11.6
patriotic 11.5
television equipment 11.3
device 11.2
icon 11.1
glowing 11.1
nighttime 10.8
science 10.7
vibrant 10.5
athlete 10.4
industry 10.2
championship 9.7
adult 9.7
group 9.7
skill 9.6
shiny 9.5
training 9.2
occupation 9.2
digital 8.9
match 8.7
muscular 8.6
effects 8.5
three dimensional 8.4
event 8.3
global 8.2
industrial 8.2
team 8.1
bright 7.9
ball 7.9
court 7.8
3d 7.7
station 7.7
imagination 7.6
park 7.5
field 7.5
teamwork 7.4
globe 7.4
graphics 7.3
playing 7.3
computer 7.3
protection 7.3
danger 7.3
cockpit 7.1
job 7.1
work 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.6
black and white 88
outdoor 88
person 80.9
clothing 64.5

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 98.3%
Calm 99.9%
Sad 0.1%
Happy 0%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 24-34
Gender Female, 83.6%
Calm 74.9%
Happy 18.8%
Sad 3.4%
Confused 1.4%
Surprised 0.6%
Angry 0.4%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Female, 77%
Calm 73.4%
Sad 22%
Happy 1.4%
Angry 1.1%
Disgusted 0.9%
Surprised 0.4%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Male, 98.4%
Calm 94.4%
Happy 3.6%
Sad 0.6%
Fear 0.6%
Surprised 0.2%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 70.6%
Airplane 62.8%

Captions

Text analysis

Amazon

39847

Google

39847
39847